How to log prompt variables

Via LLM proxy

If you are using the LLM proxy, first set up your prompts in Keywords AI and reference them in your code. Keywords AI will then log the LLM requests and prompt variables automatically.

Example: suppose you have defined a prompt in Keywords AI with the variables {{language}}, {{task_description}}, and {{specific_library}}. Deploy it in your code by referencing its prompt_id and passing values for those variables:

"model": "gpt-4o-mini",
"prompt": {
    "prompt_id": "09IqVI-test",
    "variables": {"language": "Python","task_description": "Square a number", "specific_library": "math"},
    "version": 15
}
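For illustration, here is a minimal Python sketch of that request using the OpenAI SDK pointed at the proxy. The base URL and the use of extra_body to carry the prompt field are assumptions for this sketch; check the Keywords AI docs for the exact endpoint and parameter names.

from openai import OpenAI

# A minimal sketch, assuming the proxy is OpenAI-compatible at this base URL
# (verify the exact endpoint in the Keywords AI docs).
client = OpenAI(
    base_url="https://api.keywordsai.co/api/",  # assumed proxy endpoint
    api_key="YOUR_KEYWORDSAI_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    # Placeholder; the deployed prompt supplies the actual messages.
    messages=[{"role": "user", "content": "placeholder"}],
    extra_body={
        "prompt": {
            "prompt_id": "09IqVI-test",
            "variables": {
                "language": "Python",
                "task_description": "Square a number",
                "specific_library": "math",
            },
            "version": 15,
        }
    },
)
print(response.choices[0].message.content)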

Once you send a request through the LLM proxy, you will see the prompt variables in the side panel.

Via Logging API

When you make a request through the Logging API, you can send the prompt variables in the prompt_messages field together with a variables field. Simply wrap each variable name in double curly braces, like {{variable_name}}.

Learn how to use the Logging API here.

Example:

"prompt_messages": [
    {
    "content": "Please develop an optimized Python function to {{task_description}}, utilizing {{specific_library}}, include error handling, and write unit tests for the function.",
    "role": "user"
    }
],
"variables": {
    "task_description": "Square a number",
    "specific_library": "math"
}