Prompts
Iterating and versioning prompts as a team.
With the Prompts management feature, you can iterate and version prompts as a team, test them in the Playground, share them, deploy them to your codebase with one line of code, and more.
Check out the Prompts page to see the prompts you have created and deployed.
How to create a prompt
Create a prompt
Go to the Prompts page, click the + New prompt button, and fill in the required fields.
Write your prompt
Write your prompt in the editor. You can use variables in your prompt by wrapping them in double curly braces, e.g. {{variable_name}}.
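For example, a prompt message using variables might look like this (the variable names are illustrative):

```
You are a support assistant for {{company_name}}.
Answer the customer's question about {{product_name}} concisely.
```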
Configure your prompt
You can configure your prompt by specifying the model, temperature, max_tokens, top_p, frequency_penalty, presence_penalty, and stream.
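For reference, a configuration with these fields might look like the following Python dictionary; the values are illustrative placeholders, not recommendations.

```python
# Illustrative prompt configuration; values are placeholders.
prompt_config = {
    "model": "gpt-4o-mini",
    "temperature": 0.7,
    "max_tokens": 512,
    "top_p": 1,
    "frequency_penalty": 0,
    "presence_penalty": 0,
    "stream": False,
}
```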
Test your prompt
After you have created your prompt, you can test it in the Playground to see its outputs: click the Playground button to run it.
Deploy your prompt to production
After you have tested your prompt and are satisfied with the outputs, you can commit it and then deploy it to your codebase with one line of code.
You can also click the Commit history button and choose a previous read-only version to deploy it.
Example
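A minimal sketch of what calling a deployed prompt could look like, assuming a chat/completions-style endpoint that takes a prompt ID and variables in the request body. The URL, header, and field names below are assumptions, not the exact API; check the API reference for your deployment.

```python
import os
import requests

# Hypothetical endpoint and payload shape -- verify the exact URL and field
# names against the API reference before using this in your codebase.
response = requests.post(
    "https://api.example.com/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['API_KEY']}"},
    json={
        "prompt": {
            "prompt_id": "your_prompt_id",            # the deployed prompt to run
            "variables": {"variable_name": "value"},  # fills {{variable_name}} in the prompt
        },
    },
)
print(response.json())
```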
Optional parameters
echo: With echo on, the response body will include an extra field.
override: Turn on override to use the params in override_params instead of the params saved in the prompt.
override_params: You can put any OpenAI chat/completions parameters here to override the prompt's parameters. This only works if override is set to true.
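Continuing the hypothetical request above, turning on override and supplying override_params might look like this; where exactly these fields sit in the request body is an assumption.

```python
# Hypothetical request body: override_params is only applied when override is true.
payload = {
    "prompt": {
        "prompt_id": "your_prompt_id",
        "variables": {"variable_name": "value"},
        "override": True,            # use override_params instead of the saved prompt params
        "override_params": {
            "temperature": 0.2,      # any OpenAI chat/completions parameter can go here
            "max_tokens": 256,
        },
    },
}
```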
Enable stream when you're using the OpenAI SDK
If you're using the OpenAI SDK and want to connect to the prompt you created, you have to specify stream=True in the call body.
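A minimal sketch with the OpenAI SDK, assuming the client is pointed at the provider's base URL and the deployed prompt is referenced through extra_body; the base URL, model placeholder, and prompt field names are assumptions.

```python
from openai import OpenAI

# Placeholder base URL and API key -- replace with your provider's values.
client = OpenAI(base_url="https://api.example.com", api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    model="gpt-4o-mini",   # placeholder; the deployed prompt may define its own model
    messages=[],           # assumption: messages come from the deployed prompt
    stream=True,           # required when connecting to a deployed prompt
    extra_body={
        "prompt": {"prompt_id": "your_prompt_id"},  # hypothetical field names
    },
)

# Iterate over the streamed chunks and print the content deltas.
for chunk in response:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```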