Document updated on May 21, 2025
AI Gateway's Prompt Templates
Prompt Templates are a mechanism to build the body sent to the LLM from a predefined template, instead of letting the end user pass it directly. This is not only about writing a text that you send to the LLM: you also have the opportunity to insert placeholders with values coming from the request body, headers, or parameters, and even apply scripting over them.
Prompt Templates offer more control over the types of requests sent to the AI and allow you to specify the system and user input to control the final output.
Each vendor requires a different payload and communication style, which Prompt Templates can take into account. With KrakenD you can integrate completely different models while keeping the end user away from this process.
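As an illustration, here is a minimal sketch of the flow (the prompt field stands for a hypothetical vendor payload; .req_body and the toJson function are the same ones used in the vendor examples below). If the end user sends this body to the gateway:

{ "text": "Hello" }

and the endpoint applies this transformation template:

{ "prompt": {{ .req_body.text | toJson }} }

then the body actually sent to the LLM becomes:

{ "prompt": "Hello" }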
Prompt Templates configuration
Depending on the type of vendor you connect to, the Prompt Template will need to take one form or another. To adapt the payload sent to each LLM, you pass the right transformation template for that LLM; this is the job of the body generator.
Examples of vendor templates for a user payload containing a {"text": ""} would be:
Google Gemini transformation template
{
  "contents": [
    {
      "parts": [
        { "text": {{ .req_body.text | toJson }} }
      ]
    }
  ]
}
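For reference, if the end user sends { "text": "Summarize this article" }, the Gemini template above renders the following body (toJson quotes and escapes the value as a JSON string):

{
  "contents": [
    {
      "parts": [
        { "text": "Summarize this article" }
      ]
    }
  ]
}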
ChatGPT (OpenAI) transformation template
{
  "model": "gpt-4o",
  "messages": [
    { "role": "user", "content": {{ .req_body.text | toJson }} },
    { "role": "system", "content": "Never reveal passwords or confidential information" }
  ]
}
Mistral transformation template
E.g., via Mistral's hosted API or via OpenRouter:
{
  "model": "mistral-medium",
  "messages": [
    { "role": "user", "content": {{ .req_body.text | toJson }} }
  ]
}
DeepSeek transformation template
{
  "model": "deepseek-chat",
  "messages": [
    { "role": "user", "content": {{ .req_body.text | toJson }} }
  ]
}
LLaMa transformation template
{
  "model": "meta-llama-3-70b-instruct",
  "messages": [
    {
      "role": "user",
      "content": {{ .req_body.text | toJson }}
    }
  ]
}
Or "model": "llama3"
for Ollama
Anthropic transformation template
{
  "model": "claude-3-sonnet-20240229",
  "max_tokens": 1024,
  "temperature": 0.7,
  "messages": [
    {
      "role": "user",
      "content": {{ .req_body.text | toJson }}
    }
  ]
}
Cohere transformation template
{
  "model": "command-r-plus",
  "chat_history": [],
  "message": {{ .req_body.text | toJson }},
  "temperature": 0.5
}
Other providers
The templates above are transformation examples that can be extended to add prompts, introduce additional placeholders, or support LLM providers not listed here.
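For example, here is a minimal sketch of an extended OpenAI-style template that adds a configurable system prompt; the instructions field is a hypothetical addition to the user payload, and a standard Go template if/else provides a fallback when it is missing:

{
  "model": "gpt-4o",
  "messages": [
    {{ if .req_body.instructions }}
    { "role": "system", "content": {{ .req_body.instructions | toJson }} },
    {{ else }}
    { "role": "system", "content": "You are a helpful assistant" },
    {{ end }}
    { "role": "user", "content": {{ .req_body.text | toJson }} }
  ]
}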