LLMConfiguration
represents a configuration for an LLM.
LLMConfiguration[prop->val]
creates a configuration based on $LLMEvaluator with the specified property set to val.
LLMConfiguration[<|prop1->val1,prop2->val2,…|>]
specifies several properties and values.
LLMConfiguration[LLMConfiguration[…],propspec]
creates a configuration based on an existing configuration.
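For example, a configuration can be built from a single rule, from an association of rules, or by modifying an existing configuration (an illustrative sketch; the property values shown are arbitrary):

    (* a single property, based on $LLMEvaluator *)
    conf1 = LLMConfiguration["Temperature" -> 0.2];

    (* several properties at once *)
    conf2 = LLMConfiguration[<|"Temperature" -> 0.2, "StopTokens" -> {"\n\n"}|>];

    (* a new configuration based on an existing one *)
    conf3 = LLMConfiguration[conf2, "Model" -> {"OpenAI", "gpt-4"}];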
Details



- LLMConfiguration objects can be used with functions such as LLMSynthesize, ChatObject, and ChatEvaluate through the LLMEvaluator option.
- The default configuration $LLMEvaluator is set to an LLMConfiguration object.
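For instance, a configuration can be passed explicitly through the LLMEvaluator option or installed as the session default (a sketch; evaluating it requires access to an external LLM service):

    conf = LLMConfiguration["Temperature" -> 0.7];

    (* pass the configuration explicitly *)
    LLMSynthesize["Write a haiku about binary trees.", LLMEvaluator -> conf]

    (* or make it the default for subsequent LLM functions *)
    $LLMEvaluator = conf;
    ChatEvaluate[ChatObject[], "Hello!"]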
- Supported properties of LLMConfiguration objects include:

    "Model"                   base model
    "Temperature"             sampling temperature
    "TotalProbabilityCutoff"  sampling probability cutoff (nucleus sampling)
    "Prompts"                 initial prompts
    "PromptDelimiter"         string to insert between prompts
    "StopTokens"              tokens on which to stop generation
    "Tools"                   list of LLMTool objects to make available
    "ToolMethod"              method to use for tool calling
- Valid settings for "Model" include:

    name                                                        named model
    {service, name}                                             named model from service
    <|"Service" -> service, "Name" -> name, "Task" -> task|>    fully-specified model
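For example, the following combines several of the properties above with a fully specified model (an illustrative sketch; the service, model name, "Task" value, prompts, and other settings are arbitrary assumptions):

    LLMConfiguration[<|
      "Model" -> <|"Service" -> "OpenAI", "Name" -> "gpt-4", "Task" -> "Chat"|>,
      "Temperature" -> 0.2,
      "TotalProbabilityCutoff" -> 0.9,
      "Prompts" -> {"You are a terse assistant.", "Answer in one sentence."},
      "PromptDelimiter" -> "\n\n",
      "StopTokens" -> {"END"}
    |>]

A model can equally be given as a plain name ("Model" -> "gpt-4") or as a {service, name} pair ("Model" -> {"OpenAI", "gpt-4"}).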
- Text generated by an LLM is sampled from a distribution. Details of the sampling can be specified using the following properties of the LLMConfiguration:

    "Temperature" -> t               sample using a positive temperature t (default 1)
    "TotalProbabilityCutoff" -> p    sample among the most probable choices with an accumulated probability of at least p (nucleus sampling; default 1)

- The setting for "PromptDelimiter" determines how multiple prompts are joined.
- Valid settings for "ToolMethod" include:

    "OpenAI"     OpenAI's function-calling mechanism
    "Textual"    generic textual tool calling
    assoc        specific textual prompting and parsing
- Valid keys in assoc include:

    "ToolPrompt"                      prompt specifying tool format
    "ToolRequestParser"               function for parsing tool requests
    "ToolResponseInsertionFunction"   function for serializing tool responses

- The prompt specified by "ToolPrompt" is only used if at least one tool is specified.
- "ToolPrompt" can be a template, and is applied to an association containing all properties of the LLMConfiguration.
- "ToolRequestParser" specifies a function that takes the most recent completion from the LLM, and returns one of the following forms:
    None                                no tool request
    {{start,end}, LLMToolRequest[…]}    tool request
    {{start,end}, Failure[…]}           invalid tool request

- The pair of integers {start,end} indicates the character range within the completion string where the tool request appears.
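A custom parser could, for instance, look for a fixed textual marker in the completion. The following sketch assumes an invented TOOL[name|{"param":value,…}] request syntax and the LLMToolRequest[name, params] constructor form:

    (* return None, or the request's character range together with an LLMToolRequest *)
    parseToolRequest[completion_String] := Module[{pos, body, name, params},
      pos = StringPosition[completion, "TOOL[" ~~ Shortest[__] ~~ "]"];
      If[pos === {}, Return[None]];
      body = StringTake[completion, First[pos] + {5, -1}];   (* text between TOOL[ and ] *)
      {name, params} = StringSplit[body, "|", 2];
      {First[pos], LLMToolRequest[name, ImportString[params, "RawJSON"]]}]

    LLMConfiguration[<|
      "Tools" -> {timeTool},   (* timeTool as defined above *)
      "ToolMethod" -> <|
        "ToolPrompt" -> "To call a tool, write TOOL[name|{\"param\":value}] and stop.",
        "ToolRequestParser" -> parseToolRequest,
        "ToolResponseInsertionFunction" -> ToString
      |>
    |>]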
Examples
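A basic example (a sketch; evaluating it requires access to an LLM service, and the generated text will vary):

    conf = LLMConfiguration[<|"Temperature" -> 0, "Prompts" -> {"Answer with a single word."}|>];
    LLMSynthesize["What is the capital of France?", LLMEvaluator -> conf]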