Prediction arguments for the CHAT_LLM problem type
| Key | Type | Description |
|---|---|---|
| temperature | float | The generative LLM temperature. |
| searchScoreCutoff | float | Cutoff for the document retriever score; search results scoring below this value are ignored. |
| systemMessage | str | The generative LLM system message. |
| numCompletionTokens | int | Maximum number of tokens to generate for chat answers. |
| llmName | str | Name of the specific LLM backend to use to power the chat experience. |
| ignoreDocuments | bool | If True, ignores all documents and search results and generates a response from the messages alone. |
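As a sketch, the arguments above could be assembled into a prediction-arguments payload such as the following. Only the keys and their types come from the table; the values, and the idea of validating them client-side before sending a request, are illustrative assumptions, not part of the documented API.

```python
# Hypothetical payload using the documented prediction-argument keys.
# Values are placeholders chosen for illustration only.
prediction_arguments = {
    "temperature": 0.2,              # float: generative LLM temperature
    "searchScoreCutoff": 0.5,        # float: retrieved results below this score are ignored
    "systemMessage": "You are a helpful assistant.",  # str: LLM system message
    "numCompletionTokens": 512,      # int: max tokens for chat answers
    "llmName": "EXAMPLE_LLM",        # str: placeholder backend name (assumption)
    "ignoreDocuments": False,        # bool: True = answer from messages alone
}

# Type checks mirroring the Type column, useful for catching mistakes
# (e.g. passing a string where a float is expected) before making a request.
EXPECTED_TYPES = {
    "temperature": float,
    "searchScoreCutoff": float,
    "systemMessage": str,
    "numCompletionTokens": int,
    "llmName": str,
    "ignoreDocuments": bool,
}

for key, expected in EXPECTED_TYPES.items():
    value = prediction_arguments[key]
    if not isinstance(value, expected):
        raise TypeError(f"{key} must be {expected.__name__}, got {type(value).__name__}")
```

Note that `bool` is checked last on its own key: in Python, `isinstance(True, int)` is also true, so order matters if a single value were checked against several types.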