The most used feature of the LLM proxy is the ability to call 200+ LLMs through a single API format. You can switch between models with a one-line code change.
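
For example, here is a minimal sketch of that one-line switch using the OpenAI Python SDK pointed at the proxy (the gpt-4o model name is an assumed example; substitute any model available in your account):

from openai import OpenAI

client = OpenAI(
    base_url="https://api.keywordsai.co/api/",
    api_key="YOUR_KEYWORDSAI_API_KEY",
)

# Call an OpenAI model (gpt-4o is an assumed example model name)...
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

# ...then switch providers by changing only the model string.
response = client.chat.completions.create(
    model="claude-3-5-haiku-20241022",
    messages=[{"role": "user", "content": "Hello"}],
)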

Choose a model

After you integrate the LLM proxy, you can choose a model from the Models page. On this page, you can see each model's description, pricing, and other metrics, which helps you choose the best model for your use case.

Model family

You can also click a specific model to see its model family, which is a group of models hosted by different LLM providers.

Integration code

If you have already integrated the LLM proxy, you can click the Code button to copy the integration code in the language you are using.

Call models in different frameworks

from openai import OpenAI

# Point the OpenAI client at the Keywords AI proxy endpoint
# and authenticate with your Keywords AI API key.
client = OpenAI(
    base_url="https://api.keywordsai.co/api/",
    api_key="YOUR_KEYWORDSAI_API_KEY",
)

# Call any supported model through the same chat completions interface;
# here the model is Anthropic's claude-3-5-haiku-20241022.
response = client.chat.completions.create(
    model="claude-3-5-haiku-20241022",
    messages=[
        {"role": "user", "content": "Tell me a long story"}
    ]
)
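
The response follows the OpenAI chat completion format regardless of which model served the request, so reading the generated text looks the same for every model. A usage sketch:

# Read the generated text from the OpenAI-compatible response object.
print(response.choices[0].message.content)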