This integration is for the Keywords AI gateway.
This is a pass-through integration only, so some Keywords AI features are not available.
Default max_tokens: Anthropic requests sent through Keywords AI default to 4096 max_tokens. Set max_tokens explicitly in your requests if you need a different value.
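
For example, to override the default, set max_tokens on the call itself. A minimal sketch; the model and token limit shown here are placeholders:

import anthropic

client = anthropic.Anthropic(
    base_url="https://api.keywordsai.co/api/anthropic/",
    api_key="Your_Keywords_AI_API_Key",
)

# Explicitly set max_tokens instead of relying on the 4096 default.
message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=2000,  # placeholder; choose the limit your use case needs
    messages=[
        {"role": "user", "content": "How are you today?"}
    ],
)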

Pros and Cons

Pros

  • Easy Setup: Only 2 lines of code to integrate (set base_url and api_key on the Anthropic client).
  • Use Claude’s Thinking Feature: You can use the Thinking feature of Claude 3.7 Sonnet or Claude 4 models (see the Thinking example after the first integration example below).

Cons

  • Limited features: Because this is a pass-through integration, some Keywords AI features are not available.

Integration examples

import anthropic

client = anthropic.Anthropic(
    base_url="https://api.keywordsai.co/api/anthropic/",
    api_key="Your_Keywords_AI_API_Key",
)

message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1000,
    system="Respond only in Yoda-speak.",
    messages=[
        {"role": "user", "content": "How are you today?"}
    ],
    metadata={
        "keywordsai_params": {
            "customer_identifier": "something" # You need to wrap the customer_identifier into the "keywordsai_params" key
        }
    },
)

print(message.content)
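
Thinking example: the same pass-through client also works with Claude’s extended thinking. A minimal sketch, assuming the standard Anthropic thinking request shape (a thinking object with type and budget_tokens, where budget_tokens is at least 1024 and less than max_tokens); adjust the model and budgets to your needs:

import anthropic

client = anthropic.Anthropic(
    base_url="https://api.keywordsai.co/api/anthropic/",
    api_key="Your_Keywords_AI_API_Key",
)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=4000,
    # Extended thinking: budget_tokens must be >= 1024 and less than max_tokens.
    thinking={
        "type": "enabled",
        "budget_tokens": 2000,
    },
    messages=[
        {"role": "user", "content": "Are there an infinite number of prime numbers such that n mod 4 == 3?"}
    ],
)

# The response content interleaves "thinking" blocks with the final "text" blocks.
print(message.content)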

Extra Headers

You can pass additional headers to be sent with your LLM requests using the extra_headers parameter. This is useful for sending custom headers required by specific models or configurations.
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.keywordsai.co/api/anthropic/",
    api_key="Your_Keywords_AI_API_Key",
)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1000,
    system="Respond only in Yoda-speak.",
    messages=[
        {"role": "user", "content": "Are there an infinite number of prime numbers such that n mod 4 == 3?"}
    ],
    metadata={
        "extra_headers": {
            "anthropic-beta": "context-1m-2025-08-07"
        }
    },
)

print(message.content)

Extra Headers: The extra_headers parameter is particularly useful for accessing beta features, such as Claude Sonnet 4’s 1M token context window via the anthropic-beta: context-1m-2025-08-07 header.

Keywords AI parameters

To use Keywords AI parameters, pass them in the metadata parameter under the keywordsai_params key. In the example above, customer_identifier is a Keywords AI parameter. These parameters take precedence over Anthropic’s own parameters when they conflict.

Nested Metadata for Keywords AI Parameters

You can pass multiple Keywords AI-specific parameters using a nested metadata structure. This lets you include custom_identifier, customer_identifier, and additional custom metadata fields all in one structure:
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.keywordsai.co/api/anthropic/",
    api_key="Your_Keywords_AI_API_Key",
)

message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1000,
    system="Respond only in Yoda-speak.",
    messages=[
        {"role": "user", "content": "How are you today?"}
    ],
    metadata={
        "keywordsai_params": {
            "custom_identifier": "session_id",
            "customer_identifier": "customer_id",
            "metadata": {
                "agent": "orchestrator"
            }
        }
    },
)

print(message.content)

Nested Metadata Structure: The metadata object should contain a keywordsai_params key, which itself contains:
  • custom_identifier: A custom identifier for tracking specific requests
  • customer_identifier: An identifier for the customer making the request
  • metadata: An additional nested object for custom metadata fields (e.g., agent type, workflow stage, etc.)