What is Vercel AI SDK tracing?

This guide shows how to set up Keywords AI tracing with Next.js and the Vercel AI SDK so you can monitor and trace your AI-powered applications.

Steps to use

If you already have a Next.js + Vercel AI SDK app, start from Step 1 below.

Optional: start with the pre-built example

npx create-next-app --example https://github.com/Keywords-AI/keywordsai-example-projects/tree/main/vercel_ai_next_openai my-keywordsai-app
Then:
  1. Add your API keys to .env.local (see Step 3)
  2. Run yarn dev (or pnpm dev) to start the dev server
  3. Start chatting and check your Keywords AI dashboard

Step 1: Install Keywords AI exporter

Install the Keywords AI exporter package:
npm install @keywordsai/exporter-vercel

Step 2: Set up OpenTelemetry instrumentation

Next.js supports OpenTelemetry instrumentation out of the box. Create instrumentation.ts in your project root (the directory that contains package.json). Install Vercel's OpenTelemetry instrumentation package:
yarn add @vercel/otel
Then configure the Keywords AI exporter:
instrumentation.ts
import { registerOTel } from "@vercel/otel";
import { KeywordsAIExporter } from "@keywordsai/exporter-vercel";

export function register() {
  registerOTel({
    serviceName: "next-app",
    traceExporter: new KeywordsAIExporter({
      apiKey: process.env.KEYWORDSAI_API_KEY,
      baseUrl: process.env.KEYWORDSAI_BASE_URL,
      debug: true,
    }),
  });
}
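Depending on your Next.js version, instrumentation.ts may need to be enabled explicitly: on Next.js 15 and later it is picked up by default, while Next.js 14 and earlier gate it behind an experimental flag. A sketch of that flag (only add it if your version requires it):

```typescript
// next.config.mjs — only needed on Next.js 14 or earlier,
// where instrumentation.ts is gated behind an experimental flag.
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    instrumentationHook: true,
  },
};

export default nextConfig;
```

If the flag is missing on an older version, the register() function in instrumentation.ts is never called and no traces are exported.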

Step 3: Configure environment variables

Add your Keywords AI credentials (and your model provider key) to .env.local:
.env.local
OPENAI_API_KEY=your_openai_api_key_here

KEYWORDSAI_API_KEY=your_keywordsai_api_key_here
KEYWORDSAI_BASE_URL=https://api.keywordsai.co
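A missing or misnamed variable typically surfaces only as failed provider calls or absent traces, so a fail-fast check at startup can save debugging time. A minimal sketch, using a hypothetical helper (missingEnvVars is illustrative and not part of either SDK):

```typescript
// Sketch: report which required variables are absent from an env object.
// The variable names match the .env.local entries above.
function missingEnvVars(
  names: string[],
  env: Record<string, string | undefined> = process.env
): string[] {
  return names.filter((name) => !env[name]);
}

// Example against a stubbed env that only sets the OpenAI key:
const missing = missingEnvVars(
  ["OPENAI_API_KEY", "KEYWORDSAI_API_KEY"],
  { OPENAI_API_KEY: "sk-..." }
);
console.log(missing); // logs [ 'KEYWORDSAI_API_KEY' ]
```

You could call such a check from instrumentation.ts so the app refuses to start with incomplete credentials.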

Step 4: Enable telemetry in your route

In your API route file (e.g. app/api/chat/route.ts), enable telemetry by passing the experimental_telemetry option to streamText.
app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages, id } = await req.json();
  console.log("chat id", id);

  const createHeader = () => {
    return {
      "X-Data-Keywordsai-Params": Buffer.from(
        JSON.stringify({
          prompt_unit_price: 100000,
        })
      ).toString("base64"),
    };
  };

  const result = streamText({
    model: openai("gpt-4o"),
    messages,
    experimental_telemetry: {
      isEnabled: true,
      metadata: {
        customer_identifier: "customer_from_metadata",
        prompt_unit_price: 100000,
      },
      headers: createHeader(),
    },
  });

  return result.toDataStreamResponse();
}
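The X-Data-Keywordsai-Params header built by createHeader() above is simply base64-encoded JSON, which is how Keywords AI-specific parameters ride along with the telemetry export. A standalone round-trip sketch using Node's Buffer, as in the route:

```typescript
// Encode Keywords AI params the same way createHeader() does above,
// then decode to confirm the value survives the round trip.
const params = { prompt_unit_price: 100000 };

// Encode: JSON → base64 string suitable for a header value.
const headerValue = Buffer.from(JSON.stringify(params)).toString("base64");

// Decode: base64 → JSON, as the receiving side would.
const decoded = JSON.parse(
  Buffer.from(headerValue, "base64").toString("utf8")
);

console.log(decoded.prompt_unit_price); // 100000
```

Base64 encoding keeps arbitrary JSON safe inside an HTTP header, which cannot carry raw braces, quotes, or non-ASCII characters reliably.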

Step 5: Run locally and verify traces

  1. Start your dev server:
pnpm dev
  2. Make some chat requests through your application
  3. Verify the traces appear in your Keywords AI dashboard

If you hit missing-dependency errors from @vercel/otel, install the missing packages and retry.

What gets traced

With this setup, Keywords AI will capture:
  • AI model calls: requests made via the Vercel AI SDK
  • Token usage: input and output token counts
  • Performance metrics: latency and throughput
  • Errors: failed requests and error details
  • Custom metadata: additional context you attach via telemetry metadata/headers