Start with the pre-built example
We've built a pre-configured example so you can get started quickly. You can find the example here. To get up and running:
- Add your API keys to `.env.local` (see Step 3 below)
- Run `yarn dev` to start the development server
- Start chatting and check your KeywordsAI dashboard
Start from scratch
You can also build your own application by following the step-by-step tutorial. If you want to understand the setup process or add KeywordsAI tracing to an existing project, follow the steps below. Execute `create-next-app` with npm, Yarn, or pnpm to bootstrap the base example:
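For example, with npx (the example repository URL is a placeholder; use the link to the pre-built example above):

```bash
# Bootstrap the project from the example repository (placeholder URL)
npx create-next-app@latest my-ai-chatbot --example <example-repo-url>
cd my-ai-chatbot
```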
- Sign up at OpenAI’s Developer Platform.
- Go to OpenAI’s dashboard and create an API key.
- If you choose to use external files for attachments, create a Vercel Blob store.
- Set the required environment variables to the token values, as shown in the example env file, but in a new file called `.env.local` (see the sample after this list).
- Run `pnpm install` to install the required dependencies.
- Run `pnpm dev` to launch the development server.
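A minimal `.env.local` for the base example might look like the sketch below. The variable names assume the AI SDK's default OpenAI provider and Vercel Blob conventions; check the example's env file for the authoritative list:

```bash
# .env.local — values are placeholders
OPENAI_API_KEY=sk-...
# Only needed if you store attachments in a Vercel Blob store
BLOB_READ_WRITE_TOKEN=vercel_blob_rw_...
```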
KeywordsAI Telemetry Setup
Now let’s add KeywordsAI tracing to monitor your AI application’s performance and usage.
Step 1: Install KeywordsAI Exporter
Install the KeywordsAI exporter package:
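For example, with pnpm (the package name here is an assumption; use the exporter package named in KeywordsAI's integration docs):

```bash
# Package name assumed — confirm against KeywordsAI's docs
pnpm add @keywordsai/exporter-vercel
```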
Step 2: Set up OpenTelemetry Instrumentation
Next.js supports OpenTelemetry instrumentation out of the box. Install Vercel’s OpenTelemetry instrumentation package, then, following the Next.js OpenTelemetry guide, create an instrumentation.ts file in your project root:
instrumentation.ts
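A minimal sketch, assuming the exporter installs as `@keywordsai/exporter-vercel` and exposes a `KeywordsAIExporter` class; adjust the import and constructor options to match the actual package:

```ts
// instrumentation.ts
// Prerequisite: pnpm add @vercel/otel (Vercel's OpenTelemetry helper)
import { registerOTel } from '@vercel/otel';
// Package and class names assumed — check KeywordsAI's docs for the exact exports
import { KeywordsAIExporter } from '@keywordsai/exporter-vercel';

export function register() {
  registerOTel({
    serviceName: 'my-ai-chatbot',
    // Route spans to KeywordsAI instead of the default OTLP endpoint
    traceExporter: new KeywordsAIExporter({
      apiKey: process.env.KEYWORDSAI_API_KEY,
    }),
  });
}
```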
Step 3: Configure Environment Variables
Add your KeywordsAI credentials to your `.env.local` file:
.env.local
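The variable name below is an assumption; use whichever key your instrumentation.ts reads (the sketch in Step 2 expects `KEYWORDSAI_API_KEY`):

```bash
# .env.local — value is a placeholder
KEYWORDSAI_API_KEY=your-keywordsai-api-key
```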
Step 4: Enable Telemetry in API Routes
In your API route files (e.g., `app/api/chat/route.ts`), enable telemetry by adding the `experimental_telemetry` option to your AI SDK functions:
app/api/chat/route.ts
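A minimal sketch of a chat route using the AI SDK's `streamText`. The model id is illustrative, and the streaming response helper assumes AI SDK v4-style APIs; adjust for the SDK version in your project:

```ts
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'), // illustrative model id
    messages,
    // Emit OpenTelemetry spans for this call; they are picked up by the
    // exporter registered in instrumentation.ts
    experimental_telemetry: {
      isEnabled: true,
    },
  });

  return result.toDataStreamResponse();
}
```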
Step 5: Test Your Setup
- Start your development server (the dev script is defined in package.json), e.g. `pnpm dev`
- Make some chat requests through your application
- Check your KeywordsAI dashboard:
- Go to Signals -> Traces
- Inspect the traces that were logged
What Gets Traced
With this setup, KeywordsAI will automatically capture:
- AI Model Calls: All calls to OpenAI models through the AI SDK
- Request/Response Data: Input messages and generated responses
- Token Usage: Input and output token counts for cost tracking
- Performance Metrics: Latency and throughput data
- Error Tracking: Failed requests and error details
- Custom Metadata: Any additional context you want to track
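For instance, the custom metadata above can be attached per call through the `metadata` field of `experimental_telemetry` (a sketch extending the Step 4 route; the key names and `functionId` value are illustrative):

```ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Attach extra context to a traced call; it appears alongside the trace in KeywordsAI
const result = streamText({
  model: openai('gpt-4o'),
  messages: [{ role: 'user', content: 'Hello!' }],
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'support-chat',                    // groups related traces
    metadata: { userId: 'user_123', plan: 'pro' }, // illustrative keys
  },
});
```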
Learn More
To learn more about the technologies used in this tutorial:
- AI SDK docs - comprehensive AI SDK documentation
- Next.js OpenTelemetry Guide - official Next.js telemetry documentation
- Vercel AI Playground - test AI models interactively
- OpenAI Documentation - learn about OpenAI features and API
- Next.js Documentation - learn about Next.js features and API