---
title: "LLMPromptGenerator"
language: "en"
type: "Symbol"
summary: "LLMPromptGenerator[f] represents a prompt generator that uses the function f. LLMPromptGenerator[f, inputspec] provides the specified inputspec to f."
keywords: 
- prompt generation
- prompt generator
- dynamic prompt generation
- system prompt generation
- message-based prompt
- dynamic prompt
- contextual prompt
- LLM
- LLM prompt function
- LLM input function
- LLM function
- prompt customization
- prompt templating
- conversation context
- chat history
- semantic prompt
- semantic search
- image prompt
- multimodal prompt
- text-to-image
- prompt output
- repository prompt
- static prompt
- generated prompt
- prompt specification
- input mapping
- association-based input
- LLM customization
- Wolfram LLM
- LLM prompt pipeline
- function-based prompt
- AI assistant input
- template prompt creation
- chat model input
- prompt programming
- structured LLM input
- LLM context function
- contextual generation
- LLM prompt logic
- Wolfram AI prompt
- prompt result
- multi-input prompt
- chat-enhanced prompt
- prompt framework
- retrieval-augmented generation
- RAG
- semantic retrieval
- retrieval prompt
- search-enhanced prompt
- document-based prompt
- external knowledge injection
- knowledge grounding
- retrieval pipeline
- RAG implementation
- LLM with search
canonical_url: "https://reference.wolfram.com/language/ref/LLMPromptGenerator.html"
source: "Wolfram Language Documentation"
related_guides: 
  - 
    title: "LLM-Related Functionality"
    link: "https://reference.wolfram.com/language/guide/LLMFunctions.en.md"
  - 
    title: "Natural Language Processing"
    link: "https://reference.wolfram.com/language/guide/NaturalLanguageProcessing.en.md"
related_functions: 
  - 
    title: "LLMSynthesize"
    link: "https://reference.wolfram.com/language/ref/LLMSynthesize.en.md"
  - 
    title: "LLMFunction"
    link: "https://reference.wolfram.com/language/ref/LLMFunction.en.md"
  - 
    title: "ChatEvaluate"
    link: "https://reference.wolfram.com/language/ref/ChatEvaluate.en.md"
  - 
    title: "SemanticSearch"
    link: "https://reference.wolfram.com/language/ref/SemanticSearch.en.md"
  - 
    title: "SemanticSearchIndex"
    link: "https://reference.wolfram.com/language/ref/SemanticSearchIndex.en.md"
  - 
    title: "LLMPrompt"
    link: "https://reference.wolfram.com/language/ref/LLMPrompt.en.md"
  - 
    title: "LLMTool"
    link: "https://reference.wolfram.com/language/ref/LLMTool.en.md"
---
[EXPERIMENTAL]

# LLMPromptGenerator

``LLMPromptGenerator[f]`` represents a prompt generator that uses the function ``f``.

``LLMPromptGenerator[f, inputspec]`` provides the specified ``inputspec`` to ``f``.

## Details

* ``LLMPromptGenerator`` is used to add message-dependent context to an LLM prompt.

* ``LLMPromptGenerator`` can be used in retrieval-augmented generation (RAG) workflows to construct prompts dynamically from retrieved documents, semantic search indexes, or other external knowledge sources.

* Possible values for ``inputspec`` are:

|                |                                         |
| -------------- | --------------------------------------- |
| "Input"        | the last user input (default)           |
| "Messages"     | the list of messages                    |
| "LLMEvaluator" | the current LLMConfiguration[…]         |
| "ChatObject"   | the whole conversation as ChatObject[…] |
| {spec1, …}     | an association of inputs                |

* The result of ``f`` must be a valid prompt. Possible forms include:

|                        |                                |
| ---------------------- | ------------------------------ |
| "text"                 | static text                    |
| LLMPrompt["name"]      | a repository prompt            |
| StringTemplate[…]      | templated text                 |
| TemplateObject[…]      | template for creating a prompt |
| Image[…]               | an image                       |
| SemanticSearchIndex[…] | a semantic search index        |
| {prompt1, …}           | a list of prompts              |

* Template objects are automatically converted to strings via ``TemplateObject[…][]``.

* A prompt created with ``TemplateObject`` can contain text and images.

* Not every LLM supports image input.

* If the result of ``f`` is not a string, a list of strings, or an image, it will be converted using ``TextString``.
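
The conversions above can be sketched directly (illustrative only): a ``TemplateObject`` result is applied with no arguments, and a non-string result such as a ``DateObject`` is rendered with ``TextString``:

```wl
(* A template result is applied as TemplateObject[…][] before being used as a prompt *)
TemplateObject[{"Current date/time: ", TemplateExpression[DateString[]]}][]

(* A non-string result, such as the DateObject returned by Today, is rendered via TextString *)
TextString[Today]
```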

## Examples (7)

### Basic Examples (1)

Define a prompt generator that always prepends the current evaluation date and time:

```wl
In[1]:= gen = LLMPromptGenerator[{"Current date/time: ", DateString[]}&]

Out[1]=
LLMPromptGenerator[Association["InputSpecifications" -> {"Input"}, 
  "Function" -> ({"Current date/time: ", DateString[]} & ), "InputSelector" -> Lookup["Input"], 
  "Options" -> {}, "LLMPacletVersion" -> "1.2.10"]]
```

Test the generator:

```wl
In[2]:= gen["hello there"]

Out[2]= {"Current date/time: ", "Mon 24 Jun 2024 18:31:48"}
```

Use the generator in an interaction with an LLM:

```wl
In[3]:= LLMSynthesize["what day is today?", LLMEvaluator -> <|"Prompts" -> gen|>]

Out[3]= "Today is Monday, June 24, 2024."
```

### Scope (5)

#### Function (3)

Use a simple prompt:

```wl
In[1]:= LLMSynthesize["hi", LLMEvaluator -> <|"Prompts" -> LLMPromptGenerator@LLMPrompt["R2D2"]|>]

Out[1]= "Beep-beep!"
```

---

Use a function to set the tone of the answer at evaluation time, based on the input:

```wl
In[1]:= gen = LLMPromptGenerator["Talk like a " <> First[StringCases[#, StartOfString ~~ "@" ~~ role : (LetterCharacter..) ~~ WordBoundary :> role], "pirate"]&]

Out[1]=
LLMPromptGenerator[Association["InputSpecifications" -> {"Input"}, 
  "Function" -> (StringJoin["Talk like a ", 
     First[StringCases[#1, StartOfString~~"@"~~role:LetterCharacter..~~WordBoundary :> role], 
      "pirate"]] & ), "InputSelector" -> Lookup["Input"], "Options" -> {}, 
  "LLMPacletVersion" -> "1.2.10"]]
```

Try with a cowboy role:

```wl
In[2]:= LLMSynthesize["@cowboy hi!", LLMEvaluator -> <|"Prompts" -> gen|>]

Out[2]= "Well, howdy there, partner! What brings ya to this neck of the digital woods?"
```

---

Define a generator from a ``SemanticSearchIndex``:

```wl
In[1]:= gen = LLMPromptGenerator@CreateSemanticSearchIndex[WikipediaData["light"]]

Out[1]=
LLMPromptGenerator[Association["InputSpecifications" -> {"Input"}, 
  "Function" -> TemplateObject[{"Answer using this information:", "\n\n", "BEGIN CONTEXT", "\n\n", 
      TemplateSlot["Items", InsertionFunction -> 
        (StringRiffle[#1, Wolf ... "GeneratedAssetLocation" -> "LocalObject", "TagKeys" -> {}, "Version" -> 1, 
          "Hash" -> 103221403795781887489925846881430479734]], #1, "Items"]] & ), 
  "InputSelector" -> Lookup["Input"], "Options" -> {}, "LLMPacletVersion" -> "1.2.10"]]
```

Use it to add content semantically relevant to the query:

```wl
In[2]:= LLMSynthesize["What is the connection between light and Aphrodite in one sentence", LLMEvaluator -> <|"Prompts" -> gen|>]

Out[2]= "According to Empedocles, the goddess Aphrodite created the human eye from the four elements and lit a fire within it, enabling sight through the interaction of this light with external rays."
```

Compare with the uninformed LLM answer:

```wl
In[3]:= LLMSynthesize["What is the connection between light and Aphrodite in one sentence"]

Out[3]= "The connection between light and Aphrodite lies in her association with beauty and radiance, often symbolized by the light of the morning star (Venus), which is a celestial embodiment of the goddess."
```

#### Input Specifications (2)

Define a generator that echoes the calling function's input (the default input specification):

```wl
In[1]:= LLMSynthesize["this is the input", "PromptText", LLMEvaluator -> <|"Prompts" -> LLMPromptGenerator[Function[Echo[#]; "the prompt"]]|>]

>> "this is the input"

Out[1]=
"the prompt

this is the input"
```

---

Give a custom input specification:

```wl
In[1]:= LLMSynthesize["This is the input", "PromptText", LLMEvaluator -> <|"Prompts" -> LLMPromptGenerator[Function[StringRiffle[{"* User input is: \"" <> ToString[#Input] <> "\"", "* Current LLM model is: " <> #LLMEvaluator["Model"]["Name"]}, "\n"]], {"LLMEvaluator", "Input"}]|>]

Out[1]=
"* User input is: \"This is the input\"
* Current LLM model is: gpt-4o

This is the input"
```

### Possible Issues (1)

A bare list of prompts is not a function, so it fails when applied to the input:

```wl
In[1]:= LLMPromptGenerator[{"Current date/time: ", DateString[]}]["content"]

Out[1]=
Failure[LLMPromptGenerator, Association["MessageTemplate" :> LLMPromptGenerator::uneval, 
  "MessageParameters" :> {InputForm[{"Current date/time: ", "Thu 27 Jun 2024 15:36:27"}], 
    InputForm["content"]}]]
```

Wrap the list using ``TemplateObject`` or ``Function``:

```wl
In[2]:= LLMSynthesize["content", "PromptText", LLMEvaluator -> {"Prompts" -> LLMPromptGenerator[TemplateObject[{"Current date/time: ", TemplateExpression@DateString[]}]]}]

Out[2]=
"Current date/time: 

Thu 27 Jun 2024 15:36:35

content"
```
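
Equivalently, wrapping the same list in a ``Function`` (as in the basic example above) delays evaluation of ``DateString[]`` until the generator is applied:

```wl
(* The & wrapper makes the list a function result, evaluated fresh on each call *)
LLMSynthesize["content", "PromptText",
  LLMEvaluator -> {"Prompts" -> LLMPromptGenerator[{"Current date/time: ", DateString[]} &]}]
```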

## See Also

* [`LLMSynthesize`](https://reference.wolfram.com/language/ref/LLMSynthesize.en.md)
* [`LLMFunction`](https://reference.wolfram.com/language/ref/LLMFunction.en.md)
* [`ChatEvaluate`](https://reference.wolfram.com/language/ref/ChatEvaluate.en.md)
* [`SemanticSearch`](https://reference.wolfram.com/language/ref/SemanticSearch.en.md)
* [`SemanticSearchIndex`](https://reference.wolfram.com/language/ref/SemanticSearchIndex.en.md)
* [`LLMPrompt`](https://reference.wolfram.com/language/ref/LLMPrompt.en.md)
* [`LLMTool`](https://reference.wolfram.com/language/ref/LLMTool.en.md)

## Related Guides

* [LLM-Related Functionality](https://reference.wolfram.com/language/guide/LLMFunctions.en.md)
* [Natural Language Processing](https://reference.wolfram.com/language/guide/NaturalLanguageProcessing.en.md)

## History

* [Introduced in 2024 (14.1)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn141.en.md)