"Anthropic" (Service Connection)
Use the Anthropic API with the Wolfram Language.
Connecting & Authenticating
ServiceConnect["Anthropic"] creates a connection to the Anthropic API. If a previously saved connection can be found, it will be used; otherwise, a new authentication request will be launched.
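For example, the following opens (or reuses) a saved connection and returns a ServiceObject that can be passed to ServiceExecute:

```wolfram
(* Open (or reuse) a connection to the Anthropic API *)
anthropic = ServiceConnect["Anthropic"]
```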
Requests
ServiceExecute["Anthropic","request",params] sends a request to the Anthropic API using parameters params. The following requests are supported.
"Completion" — create text completion for a given prompt
"Prompt" | (required) | the prompt for which to generate completions |
"MaxTokens" | Automatic | maximum number of tokens to generate |
"Metadata" | Automatic | metadata about the request |
"Model" | Automatic | name of the model to use |
"StopTokens" | None | strings at which the API will stop generating further tokens |
"Stream" | False | whether to return the result as server-sent events |
"Temperature" | Automatic | sampling temperature (between 0 and 1) |
"TopProbabilities" | Automatic | sample only among the k highest-probability classes |
"TotalProbabilityCutoff" | None | sample among the most probable classes with an accumulated probability of at least p (nucleus sampling) |
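A "Completion" request might look like the following; the prompt text, model name, and parameter values shown are illustrative assumptions, not fixed requirements:

```wolfram
(* Request a text completion for a prompt; model name is an example *)
ServiceExecute["Anthropic", "Completion",
  {
    "Prompt" -> "Why is the sky blue?",
    "Model" -> "claude-2.1",   (* assumed model name for illustration *)
    "MaxTokens" -> 200,
    "Temperature" -> 0.7
  }]
```

Parameters left unspecified take their defaults from the table above (Automatic, None or False as listed).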
"Chat" — create a response for the given chat conversation
"Messages" | (required) | a list of messages in the conversation |
"MaxTokens" | Automatic | maximum number of tokens to generate |
"Metadata" | Automatic | metadata about the request |
"Model" | Automatic | name of the model to use |
"StopTokens" | None | strings at which the API will stop generating further tokens |
"Stream" | False | whether to return the result as server-sent events |
"Temperature" | Automatic | sampling temperature (between 0 and 1) |
"TopProbabilities" | Automatic | sample only among the k highest-probability classes |
"TotalProbabilityCutoff" | None | sample among the most probable classes with an accumulated probability of at least p (nucleus sampling) |
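A "Chat" request takes the conversation so far as a list of messages. The sketch below assumes each message is an association with role and content keys, and the model name is an example, not a guaranteed value:

```wolfram
(* Continue a chat conversation; message keys and model name are assumed for illustration *)
ServiceExecute["Anthropic", "Chat",
  {
    "Messages" -> {
      <|"Role" -> "user", "Content" -> "What is the Wolfram Language?"|>
    },
    "Model" -> "claude-2.1",   (* assumed model name for illustration *)
    "MaxTokens" -> 300
  }]
```

To carry on a multi-turn conversation, append the returned assistant message and the next user message to "Messages" and call ServiceExecute again.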