Returns a conversation response that continues the conversation based on the input message and the deployment conversation ID (if one exists).
REQUIRED | KEY | TYPE | DESCRIPTION |
---|---|---|---|
Yes | deploymentId | str | The unique identifier of a deployment created under the project. |
Yes | deploymentToken | str | A token used to authenticate access to deployments created in this project. This token is only authorized to predict on deployments in this project, so it is safe to embed this model inside of an application or website. |
Yes | message | str | A message from the user. |
No | deploymentConversationId | str | The unique identifier of a deployment conversation to continue. If not specified, a new one will be created. |
No | externalSessionId | str | A user-supplied unique identifier of a deployment conversation to continue. If specified, it will be used instead of an internal deployment conversation ID. |
No | llmName | str | The name of the specific LLM backend used to power the chat experience. |
No | numCompletionTokens | int | The default maximum number of tokens to generate for chat answers. |
No | systemMessage | str | The system message for the generative LLM. |
No | temperature | float | The temperature for the generative LLM. |
No | filterKeyValues | dict | A dictionary mapping column names to a list of values used to restrict the retrieved search results. |
No | searchScoreCutoff | float | Cutoff for the document retriever score. Matching search results below this score will be ignored. |
No | chatConfig | dict | A dictionary specifying the query chat config override. |
No | attachments | dict | A dictionary of binary data to use to answer the queries. |
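
The request parameters above map one-to-one onto a JSON payload. Below is a minimal sketch of calling the endpoint over HTTPS with Python's `requests` library; the base URL, the placeholder credentials, and the optional values shown are assumptions for illustration and should be replaced with the values for your own deployment.

```python
import requests

# Assumed REST endpoint for this prediction API; substitute the URL shown for your
# deployment if it differs.
URL = "https://api.abacus.ai/api/v0/getConversationResponse"

payload = {
    # Required parameters (placeholders).
    "deploymentId": "YOUR_DEPLOYMENT_ID",
    "deploymentToken": "YOUR_DEPLOYMENT_TOKEN",
    "message": "What does the warranty cover?",
    # Optional overrides (values here are purely illustrative).
    "numCompletionTokens": 512,
    "temperature": 0.2,
    "searchScoreCutoff": 0.5,
    "filterKeyValues": {"product_line": ["widgets"]},
}

resp = requests.post(URL, json=payload, timeout=60)
resp.raise_for_status()
body = resp.json()

if body.get("success"):
    # `result` is assumed to hold the NlpChatResponse payload on success.
    print(body.get("result"))
else:
    print("Call failed:", body)
```

Because the deployment token is only authorized to predict on deployments in this project, the same payload shape can be sent from an embedded application or website.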
KEY | TYPE | DESCRIPTION |
---|---|---|
success | Boolean | true if the call succeeded, false if there was an error |
result | NlpChatResponse | The conversation response |
TYPE | WHEN |
---|---|
DataNotFoundError | |
DataNotFoundError | |
DataNotFoundError | |
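
To keep a multi-turn conversation going without tracking the server-generated conversation ID, the caller can supply its own externalSessionId and reuse it on every turn, as described in the parameter table above. The following is a minimal sketch under the same assumptions as the previous example (assumed endpoint URL, placeholder credentials, and an assumed `result` field carrying the NlpChatResponse).

```python
import requests
import uuid

URL = "https://api.abacus.ai/api/v0/getConversationResponse"  # assumed endpoint
SESSION_ID = str(uuid.uuid4())  # user-supplied session identifier, reused across turns


def ask(message: str) -> dict:
    """Send one conversation turn, keeping every turn in the same session."""
    payload = {
        "deploymentId": "YOUR_DEPLOYMENT_ID",        # placeholder
        "deploymentToken": "YOUR_DEPLOYMENT_TOKEN",  # placeholder
        "message": message,
        "externalSessionId": SESSION_ID,
    }
    resp = requests.post(URL, json=payload, timeout=60)
    body = resp.json()
    if not resp.ok or not body.get("success"):
        # Error details (e.g. a DataNotFoundError for a missing deployment or
        # conversation) are expected to be reported in this body.
        raise RuntimeError(f"Conversation request failed: {body}")
    return body.get("result", {})  # assumed NlpChatResponse payload


first = ask("Summarize the onboarding document.")
follow_up = ask("Now list the action items from it.")
```

Reusing the same externalSessionId on each call keeps the turns in one deployment conversation, so the second question is answered in the context of the first.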