Keywords AI provides a robust, flexible LLM proxy with access to 250+ LLMs. You can use Keywords AI with the OpenAI SDK simply by passing the Keywords AI base_url and api_key to the OpenAI client.
With this integration, your LLM requests go through the Keywords AI gateway and are automatically logged to Keywords AI.
Integration examples
Python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.keywordsai.co/api/",
    api_key=YOUR_KEYWORDSAI_API_KEY,
)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a long story"}],
)
TypeScript
import { OpenAI } from "openai";

const client = new OpenAI({
  baseURL: "https://api.keywordsai.co/api",
  apiKey: process.env.KEYWORDS_AI_API_KEY,
});

const response = await client.chat.completions
  .create({
    messages: [{ role: "user", content: "Say this is a test" }],
    model: "gpt-4o-mini",
  })
  .asResponse();
console.log(await response.json());
Go
package main

import (
	"context"
	"fmt"

	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"
)

func main() {
	client := openai.NewClient(
		option.WithBaseURL("https://api.keywordsai.co/api"),
		option.WithAPIKey("KEYWORDSAI_API_KEY"), // defaults to os.LookupEnv("OPENAI_API_KEY")
	)
	chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
		Messages: openai.F([]openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("Say this is a test"),
		}),
		Model: openai.F(openai.ChatModelGPT4o),
	})
	if err != nil {
		panic(err.Error())
	}
	fmt.Println(chatCompletion.Choices[0].Message.Content)
}
Supported parameters
OpenAI parameters
We support all the OpenAI parameters. You can pass them directly in the request body.
For example, you can stream the response:
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True,
)
The same works in TypeScript; since stream is a standard OpenAI parameter, no special handling is needed:
const response = await client.chat.completions.create({
  messages: [{ role: "user", content: "Tell me a story" }],
  model: "gpt-4o-mini",
  stream: true,
});
Keywords AI parameters
Keywords AI parameters can be passed differently depending on your programming language:
In Python, you can pass Keywords AI parameters through the extra_body parameter.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a story"}],
    extra_body={"customer_identifier": "test_user_1"},
)
In TypeScript, add a // @ts-expect-error comment above the parameter, since Keywords AI parameters are not part of the OpenAI SDK's types.
const response = await client.chat.completions.create({
  messages: [{ role: "user", content: "Tell me a story" }],
  model: "gpt-4o-mini",
  // @ts-expect-error
  customer_identifier: "test_user_1",
});
If a Keywords AI parameter conflicts with an OpenAI parameter, the Keywords AI parameter takes precedence.
Call Azure OpenAI with OpenAI SDK
To call Azure OpenAI models, the easiest approach is to use the standard OpenAI client rather than the Azure OpenAI client.
from openai import AsyncOpenAI

azure_client = AsyncOpenAI(
    api_key=KEYWORDSAI_API_KEY,
    base_url="https://api.keywordsai.co/api",
)
response = await azure_client.chat.completions.create(**kwargs)
You can then upload your Azure credentials to the Keywords AI platform to use those models. Check out our Providers integration guide.
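Here kwargs is just an ordinary chat-completions argument dict. A minimal sketch, with a placeholder model name (use whichever model your Azure credentials on Keywords AI expose):

```python
# Hypothetical kwargs for the call above; the model name is a placeholder.
kwargs = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Tell me a story"}],
}
# Then pass it to chat.completions.create(**kwargs) as shown above.
```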