Steps

1. Install the Keywords AI SDK

npm install @keywordsai/tracing
2. Add an instrumentation file to the root of your project

Add an instrumentation file (for example, instrumentation.ts) to the root of your project, at the same level as your .env or next.config.js file.

Then add the KeywordsAIExporter as the trace exporter

root-of-project/instrumentation.ts
import { KeywordsAIExporter } from "@keywordsai/tracing";
import { registerOTel } from "@vercel/otel";

// This file is a special Next.js file that is loaded on startup
export async function register() {
  try {
    // Only load the instrumentation in the Node.js runtime (not in the edge runtime or on the client)
    if (process.env.NEXT_RUNTIME !== "nodejs") return;
    registerOTel({
      serviceName: "keywords-ai-example",
      traceExporter: new KeywordsAIExporter({
        // baseUrl: "https://api.keywords.co", // This is the default base URL; enterprise customers should change it
        apiKey: process.env.KEYWORDSAI_API_KEY_TEST,
      }),
    });
  } catch (error) {
    console.error("Failed to load instrumentation:", error);
  }
}
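The exporter above reads its key from an environment variable. A minimal .env entry might look like the following (the variable name matches the snippet above; the value shown is a placeholder, not a real key):

```
# .env — assumption: an API key generated from your Keywords AI dashboard
KEYWORDSAI_API_KEY_TEST=your-keywordsai-api-key
```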
3. (Optional) Configure next.config.js

If you are using Next.js, do the following:

  1. Add the node-loader package:
npm install node-loader
  2. Add the following to your next.config.js file:
next.config.js
const nextConfig = {
  webpack: (config, { isServer }) => {
    config.module.rules.push({
      test: /\.node$/,
      loader: "node-loader",
    });
    if (isServer) {
      config.ignoreWarnings = [{ module: /opentelemetry/ }];
    }
    // In this block, we ignore/browserify some Node.js dependencies
    config.resolve.alias = {
      ...config.resolve.alias,
      zlib: require.resolve("browserify-zlib"),
    };
    config.resolve.fallback = {
      ...config.resolve.fallback,
      tls: false,
      fs: false,
      http: false,
      https: false,
      http2: false,
      net: false,
      dns: false,
      os: false,
      path: false,
      stream: false,
    };
    // End of ignoring/browserifying Node.js dependencies
    return config;
  },
};

module.exports = nextConfig;
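Note that the config above aliases zlib to require.resolve("browserify-zlib"), so that package must be installed as well or the resolve call will fail at build time:

```
npm install browserify-zlib
```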
4. Send traces to Keywords AI

Then, to trace a specific function, simply enable tracing. Here is an example using the streamText function:

import { openai } from "@ai-sdk/openai";
import { convertToCoreMessages, streamText } from "ai";

const result = await streamText({
  model: openai("gpt-4o"),
  messages: convertToCoreMessages(messages),
  // Add this block here
  experimental_telemetry: {
    isEnabled: true,
  },
  // End of added block
});
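Beyond isEnabled, the Vercel AI SDK's telemetry options also accept a functionId and a metadata object, which are attached to the exported spans so you can identify and filter calls in your traces. A minimal sketch (the functionId and metadata values below are hypothetical examples, not required names):

```typescript
// Hypothetical telemetry options object: functionId labels this call site,
// metadata attaches custom attributes to the spans the exporter receives.
const telemetry = {
  isEnabled: true,
  functionId: "chat-completion", // assumption: any identifier you choose
  metadata: { userId: "user-123" }, // assumption: example custom attribute
};

// Pass it as `experimental_telemetry: telemetry` in the streamText call.
console.log(telemetry.isEnabled, telemetry.functionId);
```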