Import
Automatically version and manage your prompts from your codebase, or create and edit them in the UI.
What is Prompt Management?
Helicone’s prompt management provides a seamless way to version, track, and optimize the prompts used in your AI applications.
Example: A prompt template designed for a course generation application.
Why manage prompts in Helicone?
Once you set up prompts in Helicone, your incoming requests will be matched to a `helicone-prompt-id`, allowing you to:
- version and track iterations of your prompt over time.
- maintain a dataset of inputs and outputs for each prompt.
- generate content using predefined prompts through our Generate API.
Quick Start
Prerequisites
Please set up Helicone in proxy mode using one of the methods in the Starter Guide.
Not sure if proxy is for you? We’ve created a guide to explain the difference between Helicone Proxy vs Helicone Async integration.
Create prompt templates
As you modify your prompt in code, Helicone automatically tracks the new version and maintains a record of the old prompt. Additionally, a dataset of input/output keys is preserved for each version.
Install the package
Import `hpf`
Add `hpf` and identify input variables
By prefixing your prompt with `hpf` and enclosing your input variables in an additional `{}`, Helicone can easily detect your prompt and inputs. We’ve designed this to require minimal code changes, keeping Prompts as easy as possible to use.
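In a real app you would import `hpf` from the `@helicone/prompts` package (package name as published by Helicone). Conceptually, `hpf` behaves like the minimal sketch below: each interpolated value is an object whose single key names the input, and the value is wrapped in `<helicone-prompt-input>` tags so Helicone can detect it.

```typescript
// Minimal sketch of an hpf-style tagged template (illustrative only;
// the real helper ships in the @helicone/prompts package).
function hpf(
  strings: TemplateStringsArray,
  ...inputs: Record<string, string>[]
): string {
  let out = strings[0];
  inputs.forEach((input, i) => {
    // The object's single key names the input; the value is wrapped in
    // tags that the Helicone proxy strips before the LLM sees the prompt.
    const [key, value] = Object.entries(input)[0];
    out += `<helicone-prompt-input key="${key}">${value}</helicone-prompt-input>` + strings[i + 1];
  });
  return out;
}

const prompt = hpf`Write a story about ${{ character: "a secret agent" }}`;
console.log(prompt);
// Write a story about <helicone-prompt-input key="character">a secret agent</helicone-prompt-input>
```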
Static Prompts with hpstatic
In addition to `hpf`, Helicone provides `hpstatic` for creating static prompts that don’t change between requests. This is useful for system prompts or other constant text that you don’t want treated as variable input.
To use `hpstatic`, import it along with `hpf`:
Then, you can use it like this:
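The import would look like `import { hpf, hpstatic } from "@helicone/prompts";` (package name assumed as published). Conceptually, `hpstatic` behaves like this sketch:

```typescript
// Minimal sketch of hpstatic (illustrative only; the real helper ships
// in @helicone/prompts alongside hpf): it wraps the whole text in
// <helicone-prompt-static> tags so Helicone treats it as constant.
function hpstatic(
  strings: TemplateStringsArray,
  ...values: unknown[]
): string {
  let text = strings[0];
  values.forEach((v, i) => {
    text += String(v) + strings[i + 1];
  });
  return `<helicone-prompt-static>${text}</helicone-prompt-static>`;
}

const systemPrompt = hpstatic`You are a helpful assistant.`;
console.log(systemPrompt);
// <helicone-prompt-static>You are a helpful assistant.</helicone-prompt-static>
```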
The `hpstatic` function wraps the entire text in `<helicone-prompt-static>` tags, indicating to Helicone that this part of the prompt should not be treated as variable input.
Change input name
To rename your input or use a custom input name, change the key in the object you pass to the string-formatter function:
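For example (a small `hpf` stand-in is inlined so this snippet runs standalone; in a real app it comes from `@helicone/prompts`):

```typescript
// hpf stand-in so the snippet runs standalone (illustrative only; the
// real helper comes from @helicone/prompts).
const hpf = (strings: TemplateStringsArray, ...inputs: Record<string, string>[]) =>
  inputs.reduce((out, input, i) => {
    const [key, value] = Object.entries(input)[0];
    return out + `<helicone-prompt-input key="${key}">${value}</helicone-prompt-input>` + strings[i + 1];
  }, strings[0]);

// The object key becomes the input name Helicone records, so renaming
// the input is just renaming the key:
const renamed = hpf`Write a story about ${{ protagonist: "a secret agent" }}`;
console.log(renamed);
// Write a story about <helicone-prompt-input key="protagonist">a secret agent</helicone-prompt-input>
```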
Assign an id to your prompt
Assign a `Helicone-Prompt-Id` header to your LLM request.
Assigning an id lets us associate your prompt with its future versions and automatically manage versioning on your behalf.
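For example, the extra header attached to the LLM request might look like this (the id value is illustrative; ids may contain letters, digits, underscores, and dashes only):

```typescript
// Example header for the LLM request (the id value is illustrative).
const promptId = "course_generation";
const headers = {
  "Helicone-Prompt-Id": promptId,
};

// Sanity check mirroring the id rule: no spaces or special characters,
// only letters, digits, underscores, and dashes.
const isValidId = /^[A-Za-z0-9_-]+$/.test(promptId);
console.log(isValidId); // true
```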
Currently, we only provide packages for TypeScript and JavaScript for easy integration. If you’re using another language, we recommend manually implementing input variables as follows:
Identify the input variable, and create a key name.
Given the below string, let’s say `a secret agent` is our input variable. Let’s name our key `character`.
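For example (this is the example string used throughout this guide):

```
Write a story about a secret agent
```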
Add `helicone-prompt-input` tags
Add `<helicone-prompt-input key="<INPUT_ID>">` before your input variable and `</helicone-prompt-input>` after it, where `INPUT_ID` is your key name. In this example, `key="character"`:
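The tagged string then looks like this:

```
Write a story about <helicone-prompt-input key="character">a secret agent</helicone-prompt-input>
```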
Static Prompts
For static prompts, you can manually wrap the static parts of your prompt in `<helicone-prompt-static>` tags:
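For example (the static system text here is illustrative):

```
<helicone-prompt-static>You are a creative writing assistant.</helicone-prompt-static>
Write a story about <helicone-prompt-input key="character">a secret agent</helicone-prompt-input>
```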
This tells Helicone that the first part of the prompt is static and should not be treated as variable input.
Tag your prompt
To ensure that we know when and how to parse and categorize your prompt, append the following header to your LLM request.
The id must not include any spaces or special characters. Underscores and dashes are acceptable.
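For example (the id value here is illustrative):

```
Helicone-Prompt-Id: story_generation
```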
Make a request
You just created a prompt template! Helicone will now keep track of all of your inputs for you. Rest assured that all `helicone-prompt-input` tags will be removed before your prompt is sent to your LLM.
Put it together
Let’s say we have an app that generates a short story, where users can input their own `character`. For example, given the prompt “Write a story about a secret agent”, the `character` is “a secret agent”.
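A sketch of the full flow (an `hpf` stand-in is inlined so the snippet runs standalone; in a real app import it from `@helicone/prompts` and send the request through the Helicone proxy; the model name and prompt id are illustrative):

```typescript
// Inline hpf stand-in (illustrative; the real helper comes from
// @helicone/prompts).
const hpf = (strings: TemplateStringsArray, ...inputs: Record<string, string>[]) =>
  inputs.reduce((out, input, i) => {
    const [key, value] = Object.entries(input)[0];
    return out + `<helicone-prompt-input key="${key}">${value}</helicone-prompt-input>` + strings[i + 1];
  }, strings[0]);

// The user-supplied input:
const character = "a secret agent";

// Build the templated message and request body (model is an example).
const body = {
  model: "gpt-4o-mini",
  messages: [
    { role: "user", content: hpf`Write a story about ${{ character }}` },
  ],
};

// Header identifying the prompt template (id value is an example).
const headers = {
  "Helicone-Prompt-Id": "story_generation",
};

// POST `body` with `headers` through your Helicone proxy endpoint; the
// proxy records the inputs and strips the tags before calling the LLM.
console.log(body.messages[0].content);
// Write a story about <helicone-prompt-input key="character">a secret agent</helicone-prompt-input>
```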
Run experiments
Once you’ve set up prompt management, you can use Helicone’s Experiments feature to test and improve your prompts.
Local testing
During development, you will often want to test your prompt locally before deploying to production, without Helicone tracking new prompt versions.
To do this, set the `Helicone-Prompt-Mode` header to `testing` in your LLM request. This prevents Helicone from tracking new prompt versions.
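For example:

```
Helicone-Prompt-Mode: testing
```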