Method
getChatResponseWithBinaryData POST

Return a chat response which continues the conversation based on the input messages and search results.

Arguments:

REQUIRED KEY TYPE DESCRIPTION
Yes deploymentToken str The deployment token to authenticate access to created deployments. This token is only authorized to predict on deployments in this project, so it is safe to embed this model inside of an application or website.
Yes deploymentId str The unique identifier to a deployment created under the project.
Yes messages list A list of chronologically ordered messages, starting with a user message and alternating between user and model messages. Each message is a dict with attributes: is_user (bool): whether the message is from the user; text (str): the message's text.
No llmName str Name of the specific LLM backend to use to power the chat experience.
No numCompletionTokens int Default maximum number of tokens for chat answers.
No systemMessage str The generative LLM system message.
No temperature float The generative LLM temperature.
No filterKeyValues dict A dictionary mapping column names to a list of values to restrict the retrieved search results.
No searchScoreCutoff float Cutoff for the document retriever score. Matching search results below this score will be ignored.
No chatConfig dict A dictionary specifying the query chat config override.
No attachments dict A dictionary of binary data to use to answer the queries.
Note: The API method arguments use camelCase; the corresponding Python SDK arguments use underscore_case. See the illustrative request sketch below.
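
This reference does not prescribe a transport encoding, so the snippet below is only a minimal sketch of one way to invoke the method over HTTP from Python. The base URL, the multipart handling of attachments, the JSON encoding of messages as a form field, and the helper name are all assumptions rather than part of this page.

```python
import json
import requests

# Hypothetical base URL; substitute the actual prediction endpoint for your deployment.
API_URL = "https://example.invalid/api/v0/getChatResponseWithBinaryData"


def get_chat_response_with_binary_data(deployment_token, deployment_id, messages,
                                       attachments=None, **optional_args):
    """Sketch of a POST call: required arguments as form fields (camelCase),
    binary attachments as multipart file parts (an assumed encoding)."""
    data = {
        "deploymentToken": deployment_token,
        "deploymentId": deployment_id,
        # messages is a list of {"is_user": bool, "text": str} dicts.
        "messages": json.dumps(messages),
    }
    # Optional arguments such as llmName, temperature, chatConfig, etc.
    data.update({key: json.dumps(value) if isinstance(value, (dict, list)) else value
                 for key, value in optional_args.items()})
    response = requests.post(API_URL, data=data, files=attachments or None, timeout=60)
    response.raise_for_status()
    return response.json()


# Example call with one binary attachment.
result = get_chat_response_with_binary_data(
    deployment_token="YOUR_DEPLOYMENT_TOKEN",
    deployment_id="YOUR_DEPLOYMENT_ID",
    messages=[{"is_user": True, "text": "Summarize the attached report."}],
    attachments={"report.pdf": open("report.pdf", "rb")},
)
```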

Response:

KEY TYPE DESCRIPTION
success Boolean true if the call succeeded, false if there was an error
NlpChatResponse
KEY TYPE DESCRIPTION
deploymentConversationId str The unique identifier of the deployment conversation.
messages list The conversation messages in the chat.
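
As a companion to the request sketch above, here is a minimal sketch of reading the documented response fields. The nesting of the NlpChatResponse under a "result" key, and the field names inside each returned message, are assumptions.

```python
def print_chat_reply(response_json):
    """Walk the documented response: the success flag, then the NlpChatResponse."""
    if not response_json.get("success"):
        raise RuntimeError("getChatResponseWithBinaryData reported failure")
    # "result" as the wrapper key for the NlpChatResponse is an assumption.
    chat = response_json.get("result", {})
    print("Conversation:", chat.get("deploymentConversationId"))
    for message in chat.get("messages", []):
        # Mirroring the request-side message attributes (is_user, text) is an assumption.
        speaker = "user" if message.get("is_user") else "model"
        print(f"{speaker}: {message.get('text')}")
```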

Exceptions:

TYPE WHEN
DataNotFoundError deploymentId is not found.
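
If the deploymentId does not resolve to a deployment, the call fails with DataNotFoundError. Over plain HTTP (as in the sketch above) this would surface as an error status; the mapping below is an assumption about how that failure can be caught.

```python
import requests

try:
    result = get_chat_response_with_binary_data(  # helper sketched above
        deployment_token="YOUR_DEPLOYMENT_TOKEN",
        deployment_id="UNKNOWN_DEPLOYMENT_ID",
        messages=[{"is_user": True, "text": "Hello"}],
    )
except requests.HTTPError as err:
    # The reference lists DataNotFoundError when deploymentId is not found;
    # assuming it is reported as an HTTP error status in this transport sketch.
    print(f"Chat request failed, check deploymentId: {err}")
```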
