"DeepSeek" (Service Connection)
       This service connection requires LLM access »
    
   Connecting & Authenticating
ServiceConnect["DeepSeek"] creates a connection to the DeepSeek API. If a previously saved connection can be found, it will be used; otherwise, a new authentication request will be launched.
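A minimal sketch of opening the connection (on first use, a dialog asks for the API credentials; the symbol name ds is illustrative):

```
(* open a connection to the DeepSeek API; reuses a saved connection if one exists *)
ds = ServiceConnect["DeepSeek"]
```

The returned ServiceObject can then be passed as the first argument of ServiceExecute.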
    
   Requests
ServiceExecute["DeepSeek","request",params] sends a request to the DeepSeek API, using parameters params. The following gives possible requests.
    
    "TestConnection" — returns Success for a working connection, a Failure object otherwise
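Checking the connection can be sketched as follows (assuming ds is a connection returned by ServiceConnect["DeepSeek"]):

```
(* verify that the saved credentials still work *)
ServiceExecute[ds, "TestConnection"]
```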
Text
"Chat" — create a response for the given chat conversation
| "Messages" | (required) | a list of messages in the conversation, each given as an association with "Role" and "Content" keys |
| "FrequencyPenalty" | Automatic | penalize tokens based on their existing frequency in the text so far |
| "MaxTokens" | Automatic | maximum number of tokens to generate |
| "Model" | Automatic | name of the model to use |
| "N" | Automatic | number of chat completions to return |
| "PresencePenalty" | Automatic | penalize new tokens based on whether they appear in the text so far |
| "StopTokens" | Automatic | strings where the API will stop generating further tokens |
| "Stream" | False | return the result as server-sent events |
| "Temperature" | Automatic | sampling temperature |
| "TotalProbabilityCutoff" | Automatic | an alternative to sampling with temperature, called nucleus sampling, in which the model considers only the tokens making up the requested cumulative probability mass |
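The "Chat" request above can be sketched as follows. This assumes ds is a connection from ServiceConnect["DeepSeek"]; the model name "deepseek-chat" and the parameter values are illustrative:

```
(* request a chat completion for a short conversation *)
ServiceExecute[ds, "Chat",
  {
    "Messages" -> {
      <|"Role" -> "system", "Content" -> "You are a helpful assistant."|>,
      <|"Role" -> "user", "Content" -> "Summarize nucleus sampling in one sentence."|>
    },
    "Model" -> "deepseek-chat",   (* model name is an assumption *)
    "MaxTokens" -> 100,
    "Temperature" -> 0.7
  }]
```

Parameters left unspecified keep their Automatic defaults, so a request with only "Messages" is also valid.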
Model Lists
"ChatModelList" — list models available for the "Chat" request
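Listing the available models can be sketched as (assuming ds is an open DeepSeek connection):

```
(* retrieve the models usable with the "Chat" request *)
ServiceExecute[ds, "ChatModelList"]
```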
See Also
ServiceExecute ▪ ServiceConnect ▪ LLMFunction ▪ LLMSynthesize ▪ ChatEvaluate ▪ LLMConfiguration
Service Connections: AlephAlpha ▪ Anthropic ▪ Cohere ▪ GoogleGemini ▪ Groq ▪ MistralAI ▪ OpenAI ▪ TogetherAI