Overview

Prompt logging gives you visibility into how your prompts perform in real-world applications. Track usage patterns, identify issues, and make data-driven improvements to your AI interactions.

Why monitor prompts?

  • Measure performance metrics: Track token usage, request volume, latency, and error rates to understand your prompt’s efficiency.
  • Compare version performance: Identify your best-performing prompt variants with side-by-side metric comparisons.
  • Analyze request distribution: See exactly how your LLM traffic is distributed across different prompts.

Quickstart

First, create a prompt in Keywords AI and find its prompt ID on the Prompts page.

Although you may have already defined the prompt's configuration (model, temperature, etc.) in the UI, you should still pass those parameters in the payload.

You don't need to pass token-related parameters; we calculate those for you. You do, however, need to pass time-related parameters such as generation_time and ttft.

import requests

url = "https://api.keywordsai.co/api/request-logs/create/"
payload = {
    "model": "claude-3-5-sonnet-20240620",
    "completion_message": {
        "role": "assistant",
        "content": "Hi, how can I assist you today?"
    },
    "prompt": {
        "prompt_id": "xxxxxx", # prompt ID in UI
        "variables": {
        # You can pass variables in the prompt if you defined any variables in the UI
        },
    },
    "generation_time": 5.7,
    "ttft": 3.1,
}
headers = {
    "Authorization": "Bearer YOUR_KEYWORDS_AI_API_KEY",
    "Content-Type": "application/json"
}

response = requests.post(url, headers=headers, json=payload)
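
To confirm the log was recorded, you can check the HTTP response. The exact response body may vary, so treat the check below as a minimal sketch:

# A 2xx status indicates the log was accepted
if response.ok:
    print(response.json())
else:
    print(f"Logging failed: {response.status_code} {response.text}")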

Variables logging

When you make a request through the Logging API, you can send prompt variables in the prompt_messages field by wrapping each variable name in a pair of {{}}.

Learn how to use the Logging API here.

Example:

"prompt": {
    "prompt_id": "xxxxxx", // prompt ID in UI
    "variables": {"language": "Python","task_description": "Square a number", "specific_library": "math"}
}
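
For reference, a prompt template in prompt_messages that uses these variables might look like the snippet below. The message content is illustrative only; each {{...}} placeholder is filled in from the variables object.

"prompt_messages": [
    {
        "role": "user",
        "content": "Write a {{language}} function to {{task_description}} using the {{specific_library}} library."
    }
]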

External prompt logging

If you don't want to create a prompt in Keywords AI but still want to log your prompts, you can pass your own prompt ID in the prompt_id field and set is_custom_prompt to true so the system knows it's a custom prompt.

Example code

import requests

url = "https://api.keywordsai.co/api/request-logs/create/"
payload = {
    "model": "claude-3-5-sonnet-20240620",
    "prompt_messages": [
        {
            "role": "user",
            "content": "Hi"
        },
    ],
    "completion_message": {
        "role": "assistant",
        "content": "Hi, how can I assist you today?"
    },
    "prompt_id": "xxxxxx", # any prompt ID you want
    "is_custom_prompt": true,
    "generation_time": 5.7
}
headers = {
    "Authorization": "Bearer YOUR_KEYWORDS_AI_API_KEY",
    "Content-Type": "application/json"
}

response = requests.post(url, headers=headers, json=payload)
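
If you log external prompts from several places in your code, you can wrap the call in a small helper. The function below is a hypothetical convenience wrapper (not part of any Keywords AI SDK), built only from the fields shown above:

import requests

KEYWORDS_AI_LOG_URL = "https://api.keywordsai.co/api/request-logs/create/"

def log_external_prompt(api_key, prompt_id, prompt_messages, completion_message,
                        model="claude-3-5-sonnet-20240620", **extra_fields):
    """Log a request against a custom (external) prompt ID."""
    payload = {
        "model": model,
        "prompt_messages": prompt_messages,
        "completion_message": completion_message,
        "prompt_id": prompt_id,      # any prompt ID you want
        "is_custom_prompt": True,    # marks this ID as one not created in the Keywords AI UI
        **extra_fields,              # e.g. generation_time, ttft
    }
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return requests.post(KEYWORDS_AI_LOG_URL, headers=headers, json=payload)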

You can then see the prompt metrics on the Dashboard and Logs pages.