Use your own API keys through Keywords AI
This section is only for Keywords AI LLM proxy users.
Keywords AI provides a robust and flexible LLM proxy with 250+ LLMs. In this section, you will learn how to call those models and how to use Keywords AI with your current LLM frameworks.
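As a minimal sketch of calling a model through the proxy, the example below points the standard OpenAI Python SDK at Keywords AI. The base URL (`https://api.keywordsai.co/api/`), the environment variable name, and the model name are illustrative assumptions; use the endpoint and API key from your Keywords AI dashboard and any model from the model library.

```python
import os
from openai import OpenAI

# Point the standard OpenAI client at the Keywords AI proxy.
# Base URL, env var name, and model name are illustrative assumptions.
client = OpenAI(
    base_url="https://api.keywordsai.co/api/",  # assumed proxy endpoint
    api_key=os.environ["KEYWORDSAI_API_KEY"],   # your Keywords AI API key
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any model from the public model library
    messages=[{"role": "user", "content": "Hello from the Keywords AI proxy!"}],
)
print(response.choices[0].message.content)
```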
Supported LLMs
To see the full list of supported LLMs, check out our public model library.
Supported providers
OpenAI
Anthropic
Azure OpenAI
Google Vertex AI
AWS Bedrock
Groq
Fireworks
Together AI
Perplexity AI
OpenRouter
Cohere
Google Gemini AI
Mistral
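Because all of these providers are reached through the same proxy endpoint, switching providers is usually just a matter of changing the model name. The sketch below reuses the client configured above; the model identifiers are illustrative assumptions, so take the exact names from the public model library.

```python
# Reuse the `client` configured for the Keywords AI proxy above.
# Model identifiers are illustrative; check the model library for exact names.
for model in ["gpt-4o-mini", "claude-3-5-sonnet-20240620", "gemini-1.5-flash"]:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hi in one word."}],
    )
    print(model, "->", resp.choices[0].message.content)
```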
Supported frameworks
LangChain
Vercel AI SDK
LlamaIndex
BAML
Mem0
PostHog
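As a rough sketch of framework usage, any OpenAI-compatible client can be pointed at the proxy in the same way, for example a LangChain chat model. The base URL and environment variable below are assumptions for illustration; each framework's page in this section covers the exact setup.

```python
import os
from langchain_openai import ChatOpenAI

# Route a LangChain chat model through the Keywords AI proxy.
# Base URL and env var name are illustrative assumptions.
llm = ChatOpenAI(
    model="gpt-4o-mini",
    api_key=os.environ["KEYWORDSAI_API_KEY"],
    base_url="https://api.keywordsai.co/api/",
)
print(llm.invoke("What is an LLM proxy?").content)
```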