ChatObject[…] represents an ongoing conversation with a remote service.


ChatObject[init] creates a new chat using the initialization parameters init.


ChatObject[…][prop] extracts the property prop from the object.

Details and Options

  • ChatObject stores a full conversation together with the message metadata.
  • The initialization init can take the following parameters:
  • "string"static text
    LLMPrompt["name"]a repository prompt
    StringTemplate[]templated text
    TemplateObject[]template for creating a text
    {prompt1,}a list of prompts
    {msg1,}a list of messages
  • Template objects are automatically converted to strings via TemplateObject[…][].
  • When the initialization is a list of messages, each message must be an association with the following keys:
  • "Role"Stringrole of the participant
    "Content"Stringcontent of the message
    "Timestamp"DateObjectmessage timestamp (optional)
  • The following options can be specified:
  • Authentication	Automatic	explicit user ID and API key
    LLMEvaluator	$LLMEvaluator	LLM configuration to use
  • LLMEvaluator can be set to an LLMConfiguration object or an association with any of the following keys:
  • "MaxTokens"maximum amount of tokens to generate
    "Model"base model
    "PromptDelimiter"string to insert between prompts
    "Prompts"initial prompts
    "StopTokens"tokens on which to stop generation
    "Temperature"sampling temperature
    "ToolMethod"method to use for tool calling
    "Tools"list of LLMTool objects to make available
    "TopProbabilities"sampling classes cutoff
    "TotalProbabilityCutoff"sampling probability cutoff (nucleus sampling)
  • Valid forms of "Model" include:
  • name	named model
    {service,name}	named model from service
    <|"Service"→service,"Name"→name,"Task"→task|>	fully specified model
  • The generated text is sampled from a distribution. Details of the sampling can be specified using the following properties of the LLMEvaluator:
  • "Temperature"tAutomaticsample using a positive temperature t
    "TopProbabilities"kAutomaticsample only among the k highest-probability classes
    "TotalProbabilityCutoff"pAutomaticsample among the most probable choices with an accumulated probability of at least p (nucleus sampling)
  • The Automatic value of these parameters uses the default for the specified "Model".
  • Multiple prompts are separated by the "PromptDelimiter" property of the LLMEvaluator.
  • Possible values for Authentication are:
  • Automatic	choose the authentication scheme automatically
    Environment	check for a key in the environment variables
    SystemCredential	check for a key in the system keychain
    ServiceObject[…]	inherit the authentication from a service object
    assoc	provide explicit key and user ID
  • With Authentication→Automatic, the function checks the variable "OPENAI_API_KEY" in Environment and SystemCredential; otherwise, it uses ServiceConnect["OpenAI"].
  • When using Authentication→assoc, assoc can contain the following keys:
  • "ID"user identity
    "APIKey"API key used to authenticate
  • Possible values for prop include:
  • "ChatID"the unique ID of the conversation
    "Messages"a list of exchanged messages
    "Properties"a list of all the possible properties
    "Usage"cumulative API usage (calls, tokens, )
    {prop1,}a list of properties



Basic Examples  (1)

Create a new chat:
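The input cell is not reproduced here; a minimal form might look like the following, assuming a configured LLM service connection:

```wolfram
chat = ChatObject[]
```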

Add a message and a response to the conversation:
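One way to do this is with ChatEvaluate, which sends a message and returns the updated chat containing both the message and the model's reply (the reply text depends on the connected service):

```wolfram
chat = ChatEvaluate[chat, "What is the capital of France?"]
```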

Get a list of all the messages:
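Using the "Messages" property listed under Details and Options:

```wolfram
chat["Messages"]
```

Each element of the result is an association with "Role", "Content" and "Timestamp" keys.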

Scope  (3)

Create an empty chat:
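With no initialization parameters:

```wolfram
chat = ChatObject[]
```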

Create a chat with an initial prompt:
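For example, using static text as the initialization (the prompt wording here is illustrative):

```wolfram
chat = ChatObject["You are a poet and respond only in rhyming verse."]
```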

Create a chat from a list of messages:
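Each message is an association with the keys described under Details and Options (the message contents here are illustrative):

```wolfram
chat = ChatObject[{
  <|"Role" -> "User", "Content" -> "Hello!"|>,
  <|"Role" -> "Assistant", "Content" -> "Hello! How can I help you today?"|>
}]
```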

Options  (1)

LLMEvaluator  (1)

Create a chat that uses a specific model:
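Using the {service,name} form of the "Model" specification in the LLMEvaluator option (the service and model names here are illustrative; use any model available through your service connection):

```wolfram
chat = ChatObject[LLMEvaluator -> <|"Model" -> {"OpenAI", "gpt-4"}|>]
```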

Wolfram Research (2023), ChatObject, Wolfram Language function,
