Supported models
Call 200+ LLMs with a single OpenAI-compatible format
The most used feature of the LLM proxy is the ability to call 200+ LLMs through a single API format. You can switch between models by changing a single line of code.
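As a minimal sketch of what this looks like in practice, the snippet below builds an OpenAI-compatible chat request and sends it with the built-in `fetch` API. The proxy URL, the `PROXY_API_KEY` environment variable, and the model names are placeholders, not real endpoints; substitute your own values from the integration code.

```typescript
// Hypothetical proxy endpoint (assumption) -- replace with your own.
const PROXY_URL = "https://llm-proxy.example.com/v1/chat/completions";

// Build an OpenAI-compatible request body. Every model behind the proxy
// accepts this same shape, so switching models is a one-line change.
function buildChatRequest(model: string, prompt: string) {
  return {
    model, // e.g. "gpt-4o" or "claude-3-5-sonnet" -- the only line that changes
    messages: [{ role: "user" as const, content: prompt }],
  };
}

async function ask(model: string, prompt: string): Promise<string> {
  const res = await fetch(PROXY_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.PROXY_API_KEY}`,
    },
    body: JSON.stringify(buildChatRequest(model, prompt)),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Calling `ask("gpt-4o", "Hello")` versus `ask("claude-3-5-sonnet", "Hello")` differs only in the model string; the request shape, headers, and response parsing stay identical.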
Choose a model
After you integrate the LLM proxy, you can choose a model from the Models page. On this page, you can see each model’s description, pricing, and other metrics, which help you choose the best model for your use case.
Model family
You can also click a specific model to see its model family, which is a group of models hosted by different LLM providers.
Integration code
If you have already integrated the LLM proxy, you can click the Code button to copy the integration code in the language you are using.
Call models in different frameworks
Here is an example of how to disable logging in the OpenAI TypeScript SDK. Because the `disable_log` field is not part of the SDK’s type definitions, you should add a `// @ts-expect-error` comment before the `disable_log` field.
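The pattern above can be sketched as follows. To keep the example self-contained, a local `ChatParams` interface stands in for the OpenAI SDK’s richer request type, and the model name is a placeholder; the point is that `disable_log` is an extension field the declared type does not know about, so the compiler must be told to expect an error on that line.

```typescript
// Narrowed stand-in for the OpenAI SDK's chat request params (sketch --
// the real SDK type has many more fields).
interface ChatParams {
  model: string;
  messages: { role: "user"; content: string }[];
}

function makeParams(): ChatParams {
  return {
    model: "gpt-4o", // placeholder model name
    messages: [{ role: "user", content: "Hello" }],
    // `disable_log` is a proxy extension absent from the declared type,
    // so TypeScript flags it as an excess property without this comment.
    // @ts-expect-error -- proxy-specific field unknown to the typed params
    disable_log: true,
  };
}
```

At runtime the field passes through to the proxy unchanged; the `// @ts-expect-error` comment only suppresses the compile-time excess-property check.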
We also support adding credentials in other SDKs and languages; check out our integration section for more information.