Traces is currently in beta. Please share any feedback or suggestions.

Create a new account in Keywords AI

  1. Go to Keywords AI and create a new account.
  2. Get your API key from the API Keys page in Settings.

Set up Traces

1. Install the SDK

Install the package using your preferred package manager:
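
For example, with pip (the package name keywordsai-tracing is an assumption based on the import path used in the steps below; check the package index entry for the exact name):

Shell
pip install keywordsai-tracing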

2. Set up Environment Variables

Get your API key from the API Keys page in Settings, then configure it in your environment:

.env
KEYWORDSAI_BASE_URL="https://api.keywordsai.co/api"
KEYWORDSAI_API_KEY="YOUR_KEYWORDSAI_API_KEY"
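
Note that Python does not read .env files automatically. If you keep the variables in a .env file, load them before initializing the SDK; a minimal sketch, assuming the python-dotenv package is installed:

Python
from dotenv import load_dotenv

# Puts KEYWORDSAI_BASE_URL and KEYWORDSAI_API_KEY into the process
# environment so the telemetry client can pick them up.
load_dotenv()

Alternatively, export the variables directly in your shell or deployment environment.
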
3. Annotate your workflows

Use the @workflow and @task decorators to instrument your code:

Python
from keywordsai_tracing.decorators import workflow, task
from keywordsai_tracing.main import KeywordsAITelemetry

# Initialize the telemetry client once; it is configured by the
# KEYWORDSAI_* environment variables set in the previous step.
k_tl = KeywordsAITelemetry()

@workflow(name="my_workflow")  # top-level span for the whole workflow
def my_workflow():
    @task(name="my_task")  # child span for a single unit of work
    def my_task():
        pass
    my_task()
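
Calling the decorated function runs it exactly as you normally would; the decorators handle the tracing, so you should see a workflow span with a nested task span, named after the name= arguments above:

Python
my_workflow()  # runs normally and records a "my_workflow" span with a nested "my_task" span
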
4. A full example with LLM calls

This example shows how to implement a workflow that includes LLM calls, using the OpenAI SDK.

main.py
from openai import OpenAI
from keywordsai_tracing.decorators import workflow, task
from keywordsai_tracing.main import KeywordsAITelemetry

k_tl = KeywordsAITelemetry()
client = OpenAI()

@workflow(name="create_joke")
def create_joke():
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
        temperature=0.5,
        max_tokens=100,
        frequency_penalty=0.5,
        presence_penalty=0.5,
    )
    return completion.choices[0].message.content
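
To run the example end to end (this assumes OPENAI_API_KEY is also set in your environment, alongside the Keywords AI variables from step 2):

Python
if __name__ == "__main__":
    print(create_joke())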

Understanding Workflows and Tasks

A workflow is a sequence of tasks that you want to monitor. Each task represents a single unit of work within that workflow.

Example

Below is an example that demonstrates how to use tasks and workflows. We’ll create a workflow that:

  1. Generates a joke
  2. Translates it to pirate speak
  3. Adds a signature

Each operation is implemented as a separate task that can be used independently or as part of the workflow.

Python
@task(name="joke_creation")
def create_joke():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
        temperature=0.5,
        max_tokens=100,
        frequency_penalty=0.5,
        presence_penalty=0.5,
        stop=["\n"],
        logprobs=True,
    )
    return completion.choices[0].message.content

@task(name="signature_generation")
def generate_signature(joke: str):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": "add a signature to the joke:\n\n" + joke}
        ],
    )
    return completion.choices[0].message.content

@task(name="pirate_joke_translation")
def translate_joke_to_pirate(joke: str):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "user",
                "content": "translate the joke to pirate language:\n\n" + joke,
            }
        ],
    )
    return completion.choices[0].message.content

Chain these tasks together in a workflow:

Python
@workflow(name="joke_workflow")
def joke_workflow():
    joke = create_joke()
    pirate_joke = translate_joke_to_pirate(joke)
    signature = generate_signature(pirate_joke)
    return signature
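
Running the workflow produces a single trace: one workflow span containing the three task spans, one per step. A sketch of what to expect (span names come from the name= arguments above):

Python
result = joke_workflow()
print(result)

# Expected span hierarchy:
# joke_workflow
# ├── joke_creation
# ├── pirate_joke_translation
# └── signature_generation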

You can then see the trace on the Traces page.

Example adapted from Traceloop