ChatLLMPredictionArguments

Prediction arguments for the CHAT_LLM problem type.

| Key | Type | Description |
| --- | --- | --- |
| `numCompletionTokens` | int | Default maximum number of tokens to generate for chat answers. |
| `temperature` | float | Sampling temperature for the generative LLM. |
| `searchScoreCutoff` | float | Cutoff for the document retriever score; matching search results below this score are ignored. |
| `ignoreDocuments` | bool | If True, ignore all documents and search results and generate the response from the messages alone. |
| `systemMessage` | str | System message for the generative LLM. |
| `llmName` | str | Name of the specific LLM backend to use to power the chat experience. |
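As a minimal sketch of how these arguments might be collected and serialized client-side, the snippet below models the documented keys as a plain dataclass. The snake_case field names, the `to_dict` helper, and the convention of dropping unset values are assumptions for illustration; they are not taken from the actual client library. The camelCase output keys match the table above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChatLLMPredictionArguments:
    """Illustrative container for CHAT_LLM prediction arguments."""
    num_completion_tokens: Optional[int] = None
    temperature: Optional[float] = None
    search_score_cutoff: Optional[float] = None
    ignore_documents: Optional[bool] = None
    system_message: Optional[str] = None
    llm_name: Optional[str] = None

    def to_dict(self) -> dict:
        # Map the snake_case fields back to the documented camelCase
        # keys, omitting any argument that was left unset (None).
        key_map = {
            "num_completion_tokens": "numCompletionTokens",
            "temperature": "temperature",
            "search_score_cutoff": "searchScoreCutoff",
            "ignore_documents": "ignoreDocuments",
            "system_message": "systemMessage",
            "llm_name": "llmName",
        }
        return {
            camel: getattr(self, snake)
            for snake, camel in key_map.items()
            if getattr(self, snake) is not None
        }

args = ChatLLMPredictionArguments(temperature=0.2, ignore_documents=True)
payload = args.to_dict()
# payload == {"temperature": 0.2, "ignoreDocuments": True}
```

Only the arguments a caller explicitly sets appear in the serialized payload, so server-side defaults (for example, the default `numCompletionTokens`) still apply for anything omitted.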