What is Vercel AI SDK tracing?
This guide shows how to set up Keywords AI tracing with Next.js and the Vercel AI SDK so you can monitor and trace your AI-powered applications.
Steps to use
If you already have a Next.js + Vercel AI SDK app, start from Step 1 below.

Optional: start with the pre-built example

- Add your API keys to .env.local (see Step 3)
- Run yarn dev (or pnpm dev) to start the dev server
- Start chatting and check your Keywords AI dashboard
1
Install Keywords AI exporter
Install the Keywords AI exporter package:
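A minimal install sketch — the package name @keywordsai/exporter-vercel is an assumption, so confirm it against the Keywords AI docs for your account:

```shell
# Install the Keywords AI exporter (package name assumed):
npm install @keywordsai/exporter-vercel
# or: yarn add @keywordsai/exporter-vercel
# or: pnpm add @keywordsai/exporter-vercel
```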
2
Set up OpenTelemetry instrumentation
Next.js supports OpenTelemetry instrumentation out of the box. Install Vercel's OpenTelemetry instrumentation, then create instrumentation.ts in your project root (where package.json lives) and configure the Keywords AI exporter:

instrumentation.ts
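A sketch of what instrumentation.ts can look like. The exporter class name and its options are assumptions based on typical OTel exporter APIs — check the exporter package's README for the exact signature:

```typescript
// instrumentation.ts — registers an OTel trace exporter for the whole app.
// Requires @vercel/otel (npm install @vercel/otel); the KeywordsAIExporter
// import and its options are assumptions, verify against the package docs.
import { registerOTel } from "@vercel/otel";
import { KeywordsAIExporter } from "@keywordsai/exporter-vercel";

export function register() {
  registerOTel({
    serviceName: "next-app",
    // Every span emitted by the Vercel AI SDK is forwarded to Keywords AI.
    traceExporter: new KeywordsAIExporter({
      apiKey: process.env.KEYWORDSAI_API_KEY,
    }),
  });
}
```

Next.js calls the exported register() function once at server startup, so no further wiring is needed in your routes.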
3
Configure environment variables
Add your Keywords AI credentials (and your provider key) to .env.local:
- OpenAI
- Anthropic
- Google Gemini
.env.local
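A sketch of the file for the OpenAI case — the variable names are assumptions, so match whatever names your exporter and provider SDK actually read:

```shell
# .env.local — never commit this file.
KEYWORDSAI_API_KEY=your-keywordsai-api-key
OPENAI_API_KEY=your-openai-api-key
# For the other providers, swap in e.g.:
# ANTHROPIC_API_KEY=...
# GOOGLE_GENERATIVE_AI_API_KEY=...
```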
4
Enable telemetry in your route
In your API route file (e.g. app/api/chat/route.ts), enable telemetry by adding the experimental_telemetry option.
- OpenAI
- Anthropic
- Google Gemini
app/api/chat/route.ts
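A sketch of the route using the OpenAI provider and the AI SDK's streamText; the model id and metadata keys are illustrative, and the Anthropic/Gemini variants differ only in the provider import:

```typescript
// app/api/chat/route.ts — assumes the "ai" and "@ai-sdk/openai" packages.
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o"),
    messages,
    // This flag routes spans through the OTel setup from Step 2.
    experimental_telemetry: {
      isEnabled: true,
      // Optional: extra context attached to the trace (key is illustrative).
      metadata: { feature: "chat" },
    },
  });

  return result.toDataStreamResponse();
}
```

Without isEnabled: true the AI SDK emits no spans, so this option is the switch that makes the exporter from Step 2 receive data.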
5
Run locally and verify traces
- Start your dev server:
- Make some chat requests through your application
- Verify traces in Keywords AI:
- Go to Logs → Traces
- Confirm requests are being traced
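The verification steps above can be exercised from the command line; the route path and payload shape below are assumptions that depend on your app:

```shell
# Start the dev server:
yarn dev
# Then, from another terminal, send a test chat request:
curl -X POST http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Hello!"}]}'
```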
If your build fails to resolve modules such as @vercel/otel, install the missing packages and retry.

What gets traced
With this setup, Keywords AI will capture:
- AI model calls: requests made via the Vercel AI SDK
- Token usage: input and output token counts
- Performance metrics: latency and throughput
- Errors: failed requests and error details
- Custom metadata: additional context you attach via telemetry metadata/headers