Tracing
You can use Keywords AI Traces to trace your LLM requests and responses.
What is Tracing?
LLM tracing records every request your application sends to a model and the response it gets back, capturing the full workflow of LLM calls and tool invocations so you can see how your AI application behaves end to end.
How to set up Tracing?
Install the SDK
Install the package using your preferred package manager:
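For example, with pip (the package name `keywordsai-tracing` is an assumption; verify it against your Keywords AI dashboard):

```shell
# Assumed PyPI package name for the Keywords AI tracing SDK
pip install keywordsai-tracing
```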
Set up Environment Variables
Get your API key from the API Keys page in Settings, then configure it in your environment:
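A minimal shell setup might look like this (the exact environment-variable name the SDK reads is an assumption; check the SDK documentation):

```shell
# Replace with the key from Settings → API Keys
export KEYWORDSAI_API_KEY="your-keywordsai-api-key"
```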
Annotate your workflows
Use the @workflow and @task decorators to instrument your code:
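A minimal sketch of the decorator pattern. The import paths and the `KeywordsAITelemetry` initializer are assumptions based on the SDK's OpenLLMetry-style API; the fallback branch lets the sketch run even without the SDK installed:

```python
try:
    # Assumed import paths — confirm against the keywordsai-tracing docs
    from keywordsai_tracing import KeywordsAITelemetry
    from keywordsai_tracing.decorators import workflow, task
    KeywordsAITelemetry()  # reads KEYWORDSAI_API_KEY from the environment
except Exception:
    # SDK missing or unconfigured: no-op decorators keep the sketch runnable
    def workflow(name=None):
        def deco(fn):
            return fn
        return deco
    task = workflow

@task(name="add_numbers")
def add_numbers(a, b):
    # Each call to a @task-decorated function becomes a span in the trace
    return a + b

@workflow(name="sum_workflow")
def sum_workflow(numbers):
    # The @workflow decorator groups the task spans under one trace
    total = 0
    for n in numbers:
        total = add_numbers(total, n)
    return total

print(sum_workflow([1, 2, 3]))  # → 6
```

Every invocation of `sum_workflow` then appears as one trace, with one child span per `add_numbers` call.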
A full example with LLM calls
This example shows how to implement a workflow that makes LLM calls, using the OpenAI SDK.
Install the SDK
Install the package using your preferred package manager:
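For example, with pip (package names are assumptions; this example also needs the OpenAI SDK):

```shell
# Assumed package names: the Keywords AI tracing SDK plus the OpenAI SDK
pip install keywordsai-tracing openai
```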
Set up Environment Variables
Get your API key from the API Keys page in Settings, then configure it in your environment:
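You will need keys for both Keywords AI and OpenAI (the Keywords AI variable name is an assumption; check the SDK documentation):

```shell
export KEYWORDSAI_API_KEY="your-keywordsai-api-key"
export OPENAI_API_KEY="your-openai-api-key"  # used by the OpenAI SDK below
```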
Create a simple task
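A sketch of a single traced task that calls the OpenAI chat API. The `@task` import path is an assumption (with a no-op fallback so the sketch runs without the SDK); the client is passed in so the task is easy to exercise with a stub:

```python
try:
    # Assumed import path — confirm against the keywordsai-tracing docs
    from keywordsai_tracing.decorators import task
except ImportError:
    def task(name=None):  # no-op fallback when the SDK is not installed
        def deco(fn):
            return fn
        return deco

@task(name="joke_creation")
def create_joke(client, model="gpt-4o-mini"):
    # `client` is an openai.OpenAI() instance (or any object with the
    # same chat.completions.create shape)
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Tell me a short joke"}],
    )
    return resp.choices[0].message.content
```

Because the function is decorated with `@task`, the LLM call inside it is captured as a span whenever the task runs inside a workflow.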
Create a workflow combining tasks
In this example, we create a workflow pirate_joke_workflow that combines the createJoke task with a translateJoke task.
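The structure described above can be sketched as follows. The SDK import paths are assumptions (with no-op fallbacks), and the task bodies are placeholders; in practice each task would make an LLM call like the `create_joke` task shown earlier:

```python
try:
    # Assumed import paths — confirm against the keywordsai-tracing docs
    from keywordsai_tracing.decorators import workflow, task
except ImportError:
    def workflow(name=None):  # no-op fallbacks when the SDK is not installed
        def deco(fn):
            return fn
        return deco
    task = workflow

@task(name="createJoke")
def createJoke():
    # Placeholder: a real task would call the LLM here
    return "Why did the container stop? It lost its compose-ure."

@task(name="translateJoke")
def translateJoke(joke):
    # Placeholder "pirate translation": a real task would call the LLM here
    return "Arrr! " + joke

@workflow(name="pirate_joke_workflow")
def pirate_joke_workflow():
    # Both task calls are recorded as child spans of this workflow's trace
    joke = createJoke()
    return translateJoke(joke)

print(pirate_joke_workflow())
```

Running the workflow produces one trace named `pirate_joke_workflow` with a span for each task, in call order.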
You can now see your traces on the Traces page, and open Logs to see the details of each LLM call.