Import
Automatically version and manage your prompts from your codebase, or create and edit them in the UI.
What is Prompt Management?
Helicone’s prompt management provides a seamless way to version, track, and optimize the prompts used in your AI applications.
Example: A prompt template designed for a course generation application.
Why manage prompts in Helicone?
Once you set up prompts in Helicone, your incoming requests will be matched to a `helicone-prompt-id`, allowing you to:
- version and track iterations of your prompt over time.
- maintain a dataset of inputs and outputs for each prompt.
- generate content using predefined prompts through our Generate API.
Quick Start
Prerequisites
Please set up Helicone in proxy mode using one of the methods in the Starter Guide.
Not sure if proxy is for you? We’ve created a guide to explain the difference between Helicone Proxy vs Helicone Async integration.
Create prompt templates
As you modify your prompt in code, Helicone automatically tracks the new version and maintains a record of the old prompt. Additionally, a dataset of input/output keys is preserved for each version.
Install the package
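For example, using npm (assuming Helicone’s `@helicone/prompts` package, which exports the prompt formatting helpers used below):

```bash
npm install @helicone/prompts
```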
Import hpf
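In TypeScript, the import looks like this (assuming the package above):

```typescript
import { hpf } from "@helicone/prompts";
```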
Add `hpf` and identify input variables
Prefixing your prompt with `hpf` and enclosing your input variables in an additional `{}` allows Helicone to easily detect your prompt and inputs. We’ve designed this to require minimal code changes, keeping Prompts as easy as possible to use.
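For example, here is a sketch of a chat completion call, where `openai` is a client already configured to use the Helicone proxy (per the prerequisites) and `character` is a user-supplied value:

```typescript
import { hpf } from "@helicone/prompts";

const character = "a secret agent"; // example input value

const chatCompletion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [
    {
      role: "user",
      // hpf marks this string as a prompt template; the extra {} around
      // `character` tells Helicone to treat it as an input variable.
      content: hpf`Write a story about ${{ character }}`,
    },
  ],
});
```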
Static Prompts with hpstatic
In addition to `hpf`, Helicone provides `hpstatic` for creating static prompts that don’t change between requests. This is useful for system prompts or other constant text that you don’t want to be treated as variable input.
To use `hpstatic`, import it along with `hpf`:
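```typescript
import { hpf, hpstatic } from "@helicone/prompts";
```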
Then, you can use it like this:
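Here is a sketch of how the two can combine, with a constant system prompt and a templated user prompt (`character` as before):

```typescript
const messages = [
  {
    role: "system",
    // Static text: identical on every request, so it is not treated as input.
    content: hpstatic`You are a creative writing assistant.`,
  },
  {
    role: "user",
    // Templated text: `character` is captured as an input variable.
    content: hpf`Write a story about ${{ character }}`,
  },
];
```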
The `hpstatic` function wraps the entire text in `<helicone-prompt-static>` tags, indicating to Helicone that this part of the prompt should not be treated as variable input.
Change input name
To rename your input or use a custom input name, change the key-value pair in the dictionary passed to the string formatter function:
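For instance, to record the input under a hypothetical key `my_character` while passing a variable with a different name:

```typescript
const secretAgent = "a secret agent";

// The dictionary key ("my_character") becomes the input name Helicone records;
// the value is whatever variable you pass in.
const prompt = hpf`Write a story about ${{ "my_character": secretAgent }}`;
```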
Assign an id to your prompt
Assign a `Helicone-Prompt-Id` header to your LLM request.
Assigning an id allows us to associate your prompt with its future versions and automatically manage versioning on your behalf.
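A sketch using the OpenAI SDK’s per-request options (the id `story_generator` is just an example name):

```typescript
const chatCompletion = await openai.chat.completions.create(
  {
    model: "gpt-4o-mini",
    messages: [
      { role: "user", content: hpf`Write a story about ${{ character }}` },
    ],
  },
  {
    headers: {
      // Ties this request, and future edits to the prompt, to one prompt id.
      "Helicone-Prompt-Id": "story_generator",
    },
  }
);
```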
Put it together
Let’s say we have an app that generates a short story, where users are able to input their own `character`. For example, the prompt is “Write a story about a secret agent”, where the `character` is “a secret agent”.
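Putting the pieces together, a minimal end-to-end sketch might look like the following. The proxy base URL and header names follow the Starter Guide; the prompt id and model are example values:

```typescript
import OpenAI from "openai";
import { hpf } from "@helicone/prompts";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  // Route requests through the Helicone proxy so prompts are tracked.
  baseURL: "https://oai.helicone.ai/v1",
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});

async function generateStory(character: string) {
  const chatCompletion = await openai.chat.completions.create(
    {
      model: "gpt-4o-mini",
      messages: [
        {
          role: "user",
          // `character` is the user-supplied input, e.g. "a secret agent".
          content: hpf`Write a story about ${{ character }}`,
        },
      ],
    },
    {
      headers: {
        // Associates this request with the prompt and its future versions.
        "Helicone-Prompt-Id": "story_generator",
      },
    }
  );

  return chatCompletion.choices[0].message.content;
}
```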
Run experiments
Once you’ve set up prompt management, you can use Helicone’s Experiments feature to test and improve your prompts.
Local testing
During development, you may often want to test your prompt locally before deploying it to production, without Helicone tracking new prompt versions.
To do this, you can set the `Helicone-Prompt-Mode` header to `testing` in your LLM request. This will prevent Helicone from tracking new prompt versions.
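For example, as an extra header on the same request shape used above:

```typescript
const chatCompletion = await openai.chat.completions.create(
  { model: "gpt-4o-mini", messages },
  {
    headers: {
      "Helicone-Prompt-Id": "story_generator",
      // Prevents this request from creating a new prompt version.
      "Helicone-Prompt-Mode": "testing",
    },
  }
);
```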
FAQ