LLMFunction

LLMFunction[prompt]

represents a template for a large language model (LLM) prompt.

LLMFunction[{prompt1,prompt2,…}]

represents a combination of multiple prompts.

LLMFunction[prompt,form]

includes the interpreter form to apply to the response.

LLMFunction[…][params]

gives the LLM service response for the prompt applied to the parameters params.

Details and Options

  • An LLMFunction can be used to generate text using a large language model (LLM). It can create content, complete sentences, extract information and more.
  • LLMFunction requires external service authentication, billing and internet connectivity.
  • The prompti support the following values:
  • "prompt"            static text
    LLMPrompt["name"]   repository prompt
    StringTemplate[…]   templated text
    TemplateObject[…]   template for creating text
  • LLMFunction supports the following options:
  • InsertionFunction   TextString      function or format to apply before inserting expressions
    CombinerFunction    StringJoin      function to apply to combine pieces within a prompt
    Authentication      Automatic       explicit user ID and API key
    LLMEvaluator        $LLMEvaluator   LLM configuration to use
  • LLMEvaluator can be set to an LLMConfiguration object or an association with any of the following keys:
  • "MaxTokens"                 maximum number of tokens to generate
    "Model"                     base model
    "PromptDelimiter"           string to insert between prompts
    "Prompts"                   initial prompts
    "StopTokens"                tokens on which to stop generation
    "Temperature"               sampling temperature
    "ToolMethod"                method to use for tool calling
    "Tools"                     list of LLMTool objects to make available
    "TopProbabilities"          sampling classes cutoff
    "TotalProbabilityCutoff"    sampling probability cutoff (nucleus sampling)
  • Valid forms of "Model" include:
  • name                                                 named model
    {service,name}                                       named model from service
    <|"Service"->service,"Name"->name,"Task"->task|>     fully specified model
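  As a sketch of how a fully specified model combines with other LLMEvaluator settings (the model name, temperature and prompt text here are illustrative assumptions, not from this page):

```wl
(* hypothetical configuration; the model name "gpt-4o-mini" is an assumption *)
config = LLMConfiguration[<|
    "Model" -> <|"Service" -> "OpenAI", "Name" -> "gpt-4o-mini"|>,
    "Temperature" -> 0.5,
    "MaxTokens" -> 256|>];

(* use the configuration for a single function *)
summarize = LLMFunction["Summarize in one sentence: ``", LLMEvaluator -> config]
```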
  • The generated text is sampled from a distribution. Details of the sampling can be specified using the following properties of the LLMEvaluator:
  • "Temperature"->t              Automatic   sample using a positive temperature t
    "TopProbabilities"->k         Automatic   sample only among the k highest-probability classes
    "TotalProbabilityCutoff"->p   Automatic   sample among the most probable choices with an accumulated probability of at least p (nucleus sampling)
  • The setting "Temperature"->Automatic resolves to zero temperature within LLMFunction. The other parameters use the default for the specified "Model".
  • Multiple prompts are separated by the "PromptDelimiter" property of the LLMEvaluator.
  • Possible values for Authentication are:
  • Automatic          choose the authentication scheme automatically
    Environment        check for a key in the environment variables
    SystemCredential   check for a key in the system keychain
    ServiceObject[…]   inherit the authentication from a service object
    assoc              provide explicit key and user ID
  • With Authentication->Automatic, the function checks the variable "OPENAI_API_KEY" in Environment and SystemCredential; otherwise, it uses ServiceConnect["OpenAI"].
  • When using Authentication->assoc, assoc can contain the following keys:
  • "ID"       user identity
    "APIKey"   API key used to authenticate
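  A minimal sketch of explicit authentication; the key and ID shown are placeholders, not working credentials:

```wl
(* placeholder credentials; substitute your own values *)
f = LLMFunction["Translate into German: ``",
   Authentication -> <|"APIKey" -> "sk-XXXXXXXX", "ID" -> "my-user-id"|>]
```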
  • LLMFunction uses machine learning. Its methods, training sets and biases included therein may change and yield varied results in different versions of the Wolfram Language.

Examples


Basic Examples  (3)

Create a function for getting cooking instructions:

Create a helper tool:

Use the function:
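The input cells for this example were not preserved in this copy; a plausible reconstruction might look like the following (the prompt wording and the tool definition are assumptions):

```wl
(* a function that produces cooking instructions *)
cook = LLMFunction["Give short step-by-step cooking instructions for ``."];

(* a hypothetical helper tool the model may call *)
timer = LLMTool["timer", {"minutes"},
   StringTemplate["Set a timer for `minutes` minutes."]];

(* attach the tool and apply the function *)
cookWithTool = LLMFunction["Give short cooking instructions for ``.",
   LLMEvaluator -> <|"Tools" -> {timer}|>];
cookWithTool["scrambled eggs"]
```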

Create a function that returns a city as an Entity:

Use the function:
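The code cells are missing here; a sketch of what they may have contained (the prompt text is an assumption):

```wl
(* the second argument "City" is an interpreter form,
   so the response is returned as an Entity *)
city = LLMFunction["What is the capital of ``?", "City"];
city["France"]
```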

Scope  (2)

Use named parameters:

Apply it using an Association:

Set a default for a parameter:
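The Scope cells were stripped from this copy; a sketch under assumed prompt text, using the two-argument form of TemplateSlot for the default:

```wl
(* named parameters via a string template *)
convert = LLMFunction[StringTemplate["Convert `quantity` `unit` to metric units."]];

(* apply it using an Association of named values *)
convert[<|"quantity" -> 3, "unit" -> "miles"|>]

(* a default value for the "lang" parameter *)
greet = LLMFunction[TemplateObject[{"Say hello in ", TemplateSlot["lang", "French"], "."}]];
greet[<||>]
```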

Properties & Relations  (1)

LLMFunction with no parameters sends the prompt directly to the LLM:

This is equivalent to LLMSynthesize with zero temperature:
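A sketch of the equivalence described above (the prompt text is illustrative):

```wl
LLMFunction["Write a haiku about snow."][]

(* should produce the same completion as *)
LLMSynthesize["Write a haiku about snow.",
  LLMEvaluator -> <|"Temperature" -> 0|>]
```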

Text

Wolfram Research (2023), LLMFunction, Wolfram Language function, https://reference.wolfram.com/language/ref/LLMFunction.html.

CMS

Wolfram Language. 2023. "LLMFunction." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/LLMFunction.html.

APA

Wolfram Language. (2023). LLMFunction. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/LLMFunction.html

BibTeX

@misc{reference.wolfram_2024_llmfunction, author="Wolfram Research", title="{LLMFunction}", year="2023", howpublished="\url{https://reference.wolfram.com/language/ref/LLMFunction.html}", note="[Accessed: 27-April-2024]"}

BibLaTeX

@online{reference.wolfram_2024_llmfunction, organization={Wolfram Research}, title={LLMFunction}, year={2023}, url={https://reference.wolfram.com/language/ref/LLMFunction.html}, note={[Accessed: 27-April-2024]}}