You can use the chat completion endpoint with the Vercel AI SDK by changing fewer than two lines of code.
Here’s an example of how to use the Vercel AI SDK with OpenAI:

import { createOpenAI, OpenAIProvider } from '@ai-sdk/openai'
import { streamText } from 'ai'

async function main() {
  // Initialize OpenAI with the Keywords AI proxy
  const client: OpenAIProvider = createOpenAI({
    baseURL: 'https://api.keywordsai.co',
    apiKey: 'YOUR_KEYWORDS_AI_API_KEY',
    compatibility: 'strict',
  })

  const requestParamsDefault: Parameters<typeof streamText>[0] = {
    model: client.chat('gpt-3.5-turbo'),
    messages: [
      {
        role: 'user',
        content: 'Hello! How are you doing today?',
      },
    ],
    temperature: 0.5,
  }

  try {
    console.log('Calling OpenAI with Keywords proxy...')
    const { textStream: proxyTextStream } = await streamText(requestParamsDefault)
    for await (const textPart of proxyTextStream) {
      console.log('Keywords Proxy Response:', textPart)
    }
  } catch (error) {
    console.error('Error:', error)
  }
}

main()
Here’s an example of how to use the Vercel AI SDK with Anthropic:

import { createAnthropic, AnthropicProvider } from '@ai-sdk/anthropic';
import { streamText, convertToCoreMessages } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export const anthropic: AnthropicProvider = createAnthropic({
  baseURL: "https://api.keywordsai.co/api/anthropic/v1",
  apiKey: process.env.KEYWORDS_AI_API_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: anthropic('claude-3-5-sonnet-20240620'), // choose any model
    messages: convertToCoreMessages(messages),
  });

  return result.toDataStreamResponse();
}
Here’s an example of how to use the Vercel AI SDK with Google Gemini:

import {
  createGoogleGenerativeAI,
  GoogleGenerativeAIProviderOptions,
} from "@ai-sdk/google";
import { generateText } from "ai";

const url = `https://api.keywordsai.co/api/google/`;

const google = createGoogleGenerativeAI({
  // custom settings
  baseURL: url,
  apiKey: process.env.KEYWORDSAI_API_KEY,
});

const { text } = await generateText({
  model: google("gemini-2.0-flash"),
  providerOptions: {
    google: {
      responseModalities: ["TEXT", "IMAGE"],
    } satisfies GoogleGenerativeAIProviderOptions,
  },
  prompt: "Hello, how are you?",
  temperature: 0.5,
});
Add Keywords AI parameters
Adding Keywords AI parameters to the Vercel AI SDK works differently from other frameworks. Here is how to do it:
Specify Keywords AI params in an object
Create an object containing the Keywords AI parameters you want to use, with each parameter as a key.
const keywordsAIHeaderContent = {
  "customer_params": {
    "customer_identifier": "customer_123",
    "name": "Hendrix Liu", // optional
    "email": "hendrix@keywordsai.co" // optional
  }
  // "cache_enabled": true or other parameters
}
Encode the object as a string
Serialize the object to JSON and base64-encode it so it can be sent as a header in your request.
const encoded = Buffer.from(JSON.stringify(keywordsAIHeaderContent)).toString('base64');
Add the header to your request
Send it in the X-Data-Keywordsai-Params header.
const client = createOpenAI({
  baseURL: process.env.KEYWORDSAI_ENDPOINT_LOCAL,
  apiKey: process.env.KEYWORDSAI_API_KEY_TEST,
  compatibility: "strict",
  headers: {
    "X-Data-Keywordsai-Params": encoded
  }
});
Full example
import { streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const keywordsAIHeaderContent = {
  "customer_identifier": "test_customer_identifier_from_header"
}

const encoded = Buffer.from(JSON.stringify(keywordsAIHeaderContent)).toString('base64');

const client = createOpenAI({
  baseURL: process.env.KEYWORDSAI_ENDPOINT_LOCAL,
  apiKey: process.env.KEYWORDSAI_API_KEY_TEST,
  compatibility: "strict",
  headers: {
    "X-Data-Keywordsai-Params": encoded
  }
});

const requestParamsDefault: Parameters<typeof streamText>[0] = {
  model: client.chat("gpt-4o"),
  messages: [
    {
      role: "user",
      content: "Hello! How are you doing today?",
    },
  ],
  temperature: 0.5,
};

try {
  console.log("Calling OpenAI with Keywords proxy...");
  const { textStream: proxyTextStream } = await streamText(requestParamsDefault);
  for await (const textPart of proxyTextStream) {
    console.log("Keywords Proxy Response:", textPart);
  }
} catch (error) {
  console.error("Error:", error);
}