This section is for Keywords AI LLM gateway users.
Use Keywords AI Gateway to call Groq models while keeping unified observability (logs, cost, latency, and reliability metrics) in Keywords AI — and optionally charge usage to your own Groq credits.

Prerequisites

  • A Keywords AI API key
  • A Groq API key (BYOK credits)
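
With both keys in place, a minimal Gateway call to a Groq model looks roughly like the sketch below. It assumes the OpenAI-compatible chat completions endpoint at https://api.keywordsai.co/api/chat/completions and uses the groq/llama-3.1-8b-versatile identifier that appears later in this guide; adjust both to match your setup.

import requests

# A minimal sketch: route a chat completion to a Groq model through the Gateway.
# The endpoint below is assumed to be the OpenAI-compatible chat completions URL.
url = "https://api.keywordsai.co/api/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_KEYWORDS_AI_API_KEY",
    "Content-Type": "application/json"
}
payload = {
    "model": "groq/llama-3.1-8b-versatile",  # Groq model identifier used later in this guide
    "messages": [
        {"role": "user", "content": "Write a short poem about AI"}
    ]
}

response = requests.post(url, headers=headers, json=payload)
print(response.json())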


Configuration

There are two ways to add your Groq credentials to your requests:

Via UI (Global)

1. Navigate to Providers

Go to the Providers page, where you can manage credentials for 20+ supported providers.

2. Add your Groq API Key

Select Groq and paste your API key.

3. Configure Load Balancing (Optional)

You can add multiple Groq API keys for redundancy. Use the Load balancing weight field to determine how traffic is distributed between keys.

Via code (Per-Request)

You can pass credentials dynamically in the request body. This is useful if you need to use your users’ own API keys (BYOK credits). Add the customer_credentials parameter to your Gateway request:
{
  // Rest of the request body
  "customer_credentials": {
    "groq": {
      "api_key": "YOUR_GROQ_API_KEY"
    }
  }
}
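
For a runnable sketch, here is the same parameter attached to a complete request body with the Python requests library. The endpoint and model identifier are the same assumptions as in the earlier example.

import requests

# Sketch: attach a user's own Groq key to a single Gateway request (BYOK).
url = "https://api.keywordsai.co/api/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_KEYWORDS_AI_API_KEY",
    "Content-Type": "application/json"
}
payload = {
    "model": "groq/llama-3.1-8b-versatile",
    "messages": [
        {"role": "user", "content": "Write a short poem about AI"}
    ],
    # Per-request credentials for this call only (BYOK)
    "customer_credentials": {
        "groq": {"api_key": "YOUR_GROQ_API_KEY"}
    }
}

response = requests.post(url, headers=headers, json=payload)
print(response.json())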

Override credentials for a particular model (Optional)

If you uploaded provider credentials in the UI, you can still override credentials for specific models on a per-request basis.
{
  // Rest of the request body
  "customer_credentials": {
    "groq": {
      "api_key": "YOUR_GROQ_API_KEY"
    }
  },
  "credential_override": {
    "groq/llama-3.1-8b-versatile": {
      "api_key": "ANOTHER_GROQ_API_KEY"
    }
  }
}
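
If you call the Gateway through the OpenAI Python SDK, the same fields can ride along via the SDK's extra_body parameter. The base_url below is an assumption about the Gateway's OpenAI-compatible endpoint; point it at your own deployment.

from openai import OpenAI

# Sketch: pass Gateway-specific fields (customer_credentials, credential_override)
# through the OpenAI SDK's extra_body passthrough.
client = OpenAI(
    base_url="https://api.keywordsai.co/api/",  # assumed Gateway base URL
    api_key="YOUR_KEYWORDS_AI_API_KEY"
)

response = client.chat.completions.create(
    model="groq/llama-3.1-8b-versatile",
    messages=[{"role": "user", "content": "Write a short poem about AI"}],
    extra_body={
        "customer_credentials": {
            "groq": {"api_key": "YOUR_GROQ_API_KEY"}
        },
        "credential_override": {
            "groq/llama-3.1-8b-versatile": {"api_key": "ANOTHER_GROQ_API_KEY"}
        }
    }
)
print(response.choices[0].message.content)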

Log Groq requests

If you are not using the Gateway to proxy requests, you can still log your Groq requests to Keywords AI asynchronously. This allows you to track cost, latency, and performance metrics for external calls.
Groq Python SDK
import requests

# Keywords AI async logging endpoint for requests made outside the Gateway
url = "https://api.keywordsai.co/api/request-logs/create/"

# Details of the Groq call that already completed: model, messages, cost, latency
payload = {
    "model": "llama3-8b-8192",
    "prompt_messages": [
        {
            "role": "user",
            "content": "Write a short poem about AI"
        }
    ],
    "completion_message": {
        "role": "assistant",
        "content": "In circuits bright and data streams, AI awakens from digital dreams..."
    },
    "cost": 0.0001,
    "generation_time": 0.8,
    "customer_params": {
        "customer_identifier": "user_101"
    }
}
headers = {
    "Authorization": "Bearer YOUR_KEYWORDS_AI_API_KEY",
    "Content-Type": "application/json"
}

response = requests.post(url, headers=headers, json=payload)

Get Started with Logging

Learn how to set up comprehensive logging for all your LLM requests