Log prompt variables
You can send your prompt variables to Keywords AI so they are recorded with your logs, letting you view the prompt template and its variables separately.
How to log prompt variables
Via LLM proxy
If you are using the LLM proxy, first create your prompt in Keywords AI and deploy it in your code. Keywords AI then logs the LLM requests and their variables automatically.
Example: suppose you define a prompt in Keywords AI like this:
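For illustration, assume a prompt whose template contains two variables, `{{topic}}` and `{{tone}}` (the variable names here are hypothetical):

```
System: You are a helpful assistant.
User: Write a short article about {{topic}} in a {{tone}} tone.
```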
Then deploy your prompt in the code:
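Below is a minimal sketch using the OpenAI Python SDK pointed at the Keywords AI proxy. The `prompt_id`, the variable names, and the placeholder message are assumptions; substitute the values from your own deployment:

```python
from openai import OpenAI

# Point the OpenAI SDK at the Keywords AI proxy.
client = OpenAI(
    base_url="https://api.keywordsai.co/api/",
    api_key="YOUR_KEYWORDSAI_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o",
    # The SDK requires messages; the proxy fills in the deployed template.
    messages=[{"role": "user", "content": "placeholder"}],
    # Assumed shape: reference the deployed prompt and pass its variables.
    extra_body={
        "prompt": {
            "prompt_id": "YOUR_PROMPT_ID",  # hypothetical ID
            "variables": {"topic": "AI observability", "tone": "friendly"},
        }
    },
)
print(response.choices[0].message.content)
```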
Once you send a request through the LLM proxy, you will see the prompt variables in the side panel.
Via Logging API
When you make a request through the Logging API, you can send the prompt variables in the `prompt_messages` field. Simply wrap each prompt variable in a pair of double curly braces, `{{}}`.
Learn how to use the Logging API here.
Example:
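A minimal sketch of a Logging API call with `requests`. The endpoint path, the sample `completion_message`, and the separate `variables` field are assumptions; verify them against the Logging API reference:

```python
import requests

url = "https://api.keywordsai.co/api/request-logs/create"
headers = {
    "Authorization": "Bearer YOUR_KEYWORDSAI_API_KEY",
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o",
    # Wrap each prompt variable in double curly braces inside the content.
    "prompt_messages": [
        {
            "role": "user",
            "content": "Write a short article about {{topic}} in a {{tone}} tone.",
        }
    ],
    "completion_message": {
        "role": "assistant",
        "content": "Sure! Here is a friendly overview of AI observability...",
    },
    # Assumed field: the concrete values for each {{variable}} above.
    "variables": {"topic": "AI observability", "tone": "friendly"},
}

response = requests.post(url, headers=headers, json=payload)
print(response.status_code)
```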