ChatObject
represents an ongoing conversation with a remote service.
ChatObject[init]
creates a new chat using the initialization parameters init.
ChatObject[…][prop]
extracts the property prop from the object.
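For example, a conversation can be started, continued with ChatEvaluate, and then inspected. A minimal sketch, assuming an already authenticated OpenAI connection; the prompt and question are illustrative:

    (* start a new chat with an initial prompt *)
    chat = ChatObject["You are a concise assistant."];
    (* ChatEvaluate sends a user message and returns the updated chat *)
    chat = ChatEvaluate[chat, "Name three primary colors."];
    (* extract a property of the conversation *)
    chat["Messages"]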
Details and Options




- ChatObject stores a full conversation together with the message metadata.
- The initialization init can take the following parameters:
    "string"              static text
    LLMPrompt["name"]     a repository prompt
    StringTemplate[…]     templated text
    TemplateObject[…]     template for creating a text
    {prompt1,…}           a list of prompts
    {msg1,…}              a list of messages
- Template objects are automatically converted to strings via TemplateObject[…][].
- When the initialization is a list of messages, each message must be an association with the following keys:
    "Role"        String       role of the participant
    "Content"     String       content of the message
    "Timestamp"   DateObject   message timestamp (optional)
- The following options can be specified:
    Authentication   Automatic       explicit user ID and API key
    LLMEvaluator     $LLMEvaluator   LLM configuration to use
- LLMEvaluator can be set to an LLMConfiguration object or an association with any of the following keys:
    "Model"                    base model
    "Temperature"              sampling temperature
    "TotalProbabilityCutoff"   sampling probability cutoff (nucleus sampling)
    "Prompts"                  prompts
    "PromptDelimiter"          delimiter to use between prompts
    "StopTokens"               tokens on which to stop generation
    "Tools"                    list of LLMTool objects to use
    "ToolPrompt"               prompt for specifying tool format
    "ToolRequestParser"        function for parsing tool requests
    "ToolResponseString"       function for serializing tool responses
- Valid forms of "Model" include:
    name                                               named model
    {service, name}                                    named model from service
    <|"Service"→service, "Name"→name, "Task"→task|>    fully specified model
- The generated text is sampled from a distribution. Details of the sampling can be specified using the following properties of the LLMEvaluator:
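For example, the three forms might look as follows; the service, model and task names are illustrative assumptions:

    LLMEvaluator -> <|"Model" -> "gpt-4"|>                                                       (* named model *)
    LLMEvaluator -> <|"Model" -> {"OpenAI", "gpt-4"}|>                                           (* named model from a service *)
    LLMEvaluator -> <|"Model" -> <|"Service" -> "OpenAI", "Name" -> "gpt-4", "Task" -> "Chat"|>|> (* fully specified *)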
    "Temperature"→t            sample using a positive temperature t
    "TopProbabilityCutoff"→p   sample among the most probable choices with an accumulated probability of at least p (nucleus sampling)
- Multiple prompts are separated by the "PromptDelimiter" property of the LLMEvaluator.
- Possible values for Authentication are:
    Automatic          choose the authentication scheme automatically
    Environment        check for a key in the environment variables
    SystemCredential   check for a key in the system keychain
    ServiceObject[…]   inherit the authentication from a service object
    assoc              provide explicit key and user ID
- With Authentication→Automatic, the function checks for the variable "OPENAI_API_KEY" in Environment and SystemCredential; otherwise, it uses ServiceConnect["OpenAI"].
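For example, a key stored in the system keychain under the conventional name "OPENAI_API_KEY" could be used like this; the key shown is a placeholder:

    SystemCredential["OPENAI_API_KEY"] = "sk-placeholder";     (* store the key once *)
    chat = ChatObject["Hello!", Authentication -> SystemCredential]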
- When using Authentication→assoc, assoc can contain the following keys:
    "ID"       user identity
    "APIKey"   API key used to authenticate
- Possible values for prop include:
    "ChatID"       the unique ID of the conversation
    "Messages"     a list of exchanged messages
    "Properties"   a list of all the possible properties
    "Usage"        cumulative API usage (calls, tokens, …)
    {prop1,…}      a list of properties

Examples
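A minimal end-to-end sketch, assuming a working OpenAI connection; the questions are illustrative and the generated replies will vary:

    (* start a chat and ask a question *)
    chat = ChatEvaluate[ChatObject[], "What is the capital of France?"];
    (* continue the same conversation; earlier messages are sent as context *)
    chat = ChatEvaluate[chat, "Roughly how many people live there?"];
    (* inspect the accumulated messages and usage *)
    chat["Messages"]
    chat["Usage"]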
Text
Wolfram Research (2023), ChatObject, Wolfram Language function, https://reference.wolfram.com/language/ref/ChatObject.html.
CMS
Wolfram Language. 2023. "ChatObject." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/ChatObject.html.
APA
Wolfram Language. (2023). ChatObject. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/ChatObject.html