ChatLLMPredictionArguments

Prediction arguments for the CHAT_LLM problem type

KEY                  TYPE    DESCRIPTION
numCompletionTokens  int     Default maximum number of tokens to generate for chat answers.
llmName              str     Name of the specific LLM backend used to power the chat experience.
ignoreDocuments      bool    If True, documents and search results are ignored and only the messages are used to generate a response.
temperature          float   The generative LLM temperature.
systemMessage        str     The system message passed to the generative LLM.
searchScoreCutoff    float   Cutoff for the document retriever score. Matching search results below this score will be ignored.
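Below is a minimal sketch of how these arguments could be assembled into a prediction request payload. Only the argument keys and types come from the table above; the surrounding payload shape (the predictionArguments key, the message format) and the example values are illustrative assumptions, not taken from this page.

import json

# CHAT_LLM prediction arguments built from the documented keys.
chat_llm_prediction_arguments = {
    "numCompletionTokens": 512,    # cap on tokens generated for the chat answer
    "llmName": "OPENAI_GPT4O",     # hypothetical backend name; use a value your deployment supports
    "ignoreDocuments": False,      # keep retrieved documents/search results in context
    "temperature": 0.2,            # lower values give more deterministic answers
    "systemMessage": "You are a concise assistant that cites retrieved documents.",
    "searchScoreCutoff": 0.5,      # drop retriever matches scoring below this value
}

# The arguments would typically be sent alongside the chat messages in the
# request body; the exact shape below is assumed for illustration.
request_body = {
    "messages": [{"is_user": True, "text": "Summarize our refund policy."}],
    "predictionArguments": chat_llm_prediction_arguments,
}

print(json.dumps(request_body, indent=2))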