Vercel AI SDK
Learn how to integrate KeywordsAI tracing with Vercel AI SDK to monitor and analyze your AI application performance. Step-by-step guide for setting up environment variables and creating traced workflows.
Prerequisites
- Vercel AI SDK
- KeywordsAI API key
This tutorial shows how to set up Keywords AI tracing with Next.js and the Vercel AI SDK to monitor and trace your AI-powered applications.
Start with the pre-built example
We provide a pre-built example so you can get started quickly. You can find the example here.
Get up and running quickly with our pre-configured example:
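For example, assuming you clone the example repository linked above (replace the placeholder with its actual URL):

```bash
# Clone the pre-built example — substitute the repository linked above
git clone <example-repo-url> keywordsai-vercel-ai-example
cd keywordsai-vercel-ai-example
```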
Then:
- Add your API keys to `.env.local` (see Step 3 below)
- Run `yarn dev` to start the development server
- Start chatting and check your KeywordsAI dashboard
Start from scratch
You can also build your own application from scratch. If you want to understand the setup process or add KeywordsAI tracing to an existing project, follow the step-by-step tutorial below.
Execute `create-next-app` with npm, Yarn, or pnpm to bootstrap the base example:
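A sketch of the bootstrap command, assuming the base example is passed via the `--example` flag (substitute the template URL of the example you are using):

```bash
# npm
npx create-next-app@latest my-ai-app --example <base-example-url>
# Yarn
yarn create next-app my-ai-app --example <base-example-url>
# pnpm
pnpm create next-app my-ai-app --example <base-example-url>
```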
To run the base example locally you need to:
- Sign up at OpenAI’s Developer Platform.
- Go to OpenAI’s dashboard and create an API key.
- If you choose to use external files for attachments, create a Vercel Blob Store.
- Set the required environment variables with the token values, as shown in the example env file, in a new file called `.env.local` (see the sketch after this list).
- Run `pnpm install` to install the required dependencies.
- Run `pnpm dev` to launch the development server.
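A minimal `.env.local` sketch for the base example; the variable names below are the conventional ones, so confirm them against the example’s env file:

```bash
# .env.local
OPENAI_API_KEY=sk-...                      # from OpenAI's dashboard
BLOB_READ_WRITE_TOKEN=vercel_blob_rw_...   # only if you use Vercel Blob for attachments
```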
KeywordsAI Telemetry Setup
Now let’s add KeywordsAI tracing to monitor your AI application’s performance and usage.
Step 1: Install KeywordsAI Exporter
Install the KeywordsAI exporter package:
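Assuming the package is published as `@keywordsai/exporter-vercel` (check npm or the KeywordsAI docs for the exact name):

```bash
npm install @keywordsai/exporter-vercel
# or: yarn add @keywordsai/exporter-vercel / pnpm add @keywordsai/exporter-vercel
```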
Step 2: Set up OpenTelemetry Instrumentation
Next.js supports OpenTelemetry instrumentation out of the box. Following the Next.js OpenTelemetry guide, you will install Vercel's OpenTelemetry package and create an instrumentation.ts file in your project root.
First, install Vercel's OpenTelemetry instrumentation package:
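```bash
npm install @vercel/otel
# or: yarn add @vercel/otel / pnpm add @vercel/otel
```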
Then create instrumentation.ts in the root of your project (where package.json lives):
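A minimal sketch of the instrumentation file. `registerOTel` comes from `@vercel/otel`; the exporter's class name and constructor options are assumptions based on the package name above, so adjust them to whatever the KeywordsAI exporter actually exports:

```ts
// instrumentation.ts
import { registerOTel } from "@vercel/otel";
// Class name and options are assumptions — check the exporter package's README.
import { KeywordsAIExporter } from "@keywordsai/exporter-vercel";

export function register() {
  registerOTel({
    serviceName: "next-ai-app",
    // Send spans to KeywordsAI instead of the default exporter
    traceExporter: new KeywordsAIExporter({
      apiKey: process.env.KEYWORDSAI_API_KEY,
    }),
  });
}
```

On older Next.js versions (before 15), you may also need to enable `experimental.instrumentationHook` in next.config.js for the instrumentation file to be picked up.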
Step 3: Configure Environment Variables
Add your KeywordsAI credentials to your `.env.local` file:
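A sketch of the variables, assuming the exporter reads `KEYWORDSAI_API_KEY`; match the names to whatever your instrumentation code expects:

```bash
# .env.local
KEYWORDSAI_API_KEY=your-keywordsai-api-key
OPENAI_API_KEY=sk-...
```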
Step 4: Enable Telemetry in API Routes
In your API route files (e.g., `app/api/chat/route.ts`), enable telemetry by adding the `experimental_telemetry` option to your AI SDK functions:
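For example, with `streamText` (the route shape and model choice are illustrative, and the response helper name varies slightly between AI SDK versions):

```ts
// app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
    // Emits OpenTelemetry spans that the exporter in instrumentation.ts picks up
    experimental_telemetry: {
      isEnabled: true,
      functionId: "chat",                  // optional: identifies this call site in traces
      metadata: { feature: "chat-route" }, // optional: custom metadata attached to spans
    },
  });

  return result.toDataStreamResponse();
}
```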
Step 5: Test Your Setup
- Start your development server:
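```bash
pnpm dev
# or: npm run dev / yarn dev
```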
You may see errors about missing peer dependencies of the @vercel/otel package; if so, install them:
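Install whichever packages the error names; for example (the exact list depends on the error you see):

```bash
npm install @opentelemetry/api @opentelemetry/sdk-trace-base
```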
If this fails, make sure you are using the correct package manager version, then try again.
- Make some chat requests through your application.
- Check your KeywordsAI dashboard:
  - Go to Signals -> Traces
  - Verify that your requests appear as traced logs
What Gets Traced
With this setup, KeywordsAI will automatically capture:
- AI Model Calls: All calls to OpenAI models through the AI SDK
- Request/Response Data: Input messages and generated responses
- Token Usage: Input and output token counts for cost tracking
- Performance Metrics: Latency and throughput data
- Error Tracking: Failed requests and error details
- Custom Metadata: Any additional context you want to track
Learn More
To learn more about the technologies used in this tutorial:
- AI SDK docs - comprehensive AI SDK documentation
- Next.js OpenTelemetry Guide - official Next.js telemetry documentation
- Vercel AI Playground - test AI models interactively
- OpenAI Documentation - learn about OpenAI features and API
- Next.js Documentation - learn about Next.js features and API