Pass these parameters through the extra_body parameter if you're using the OpenAI SDK.
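A minimal sketch of this pattern with the OpenAI Python SDK. The base URL, API key, and the customer_identifier field are placeholders for illustration, not a verified API surface.

```python
from openai import OpenAI

# Placeholder base URL and API key -- substitute your gateway's actual values.
client = OpenAI(
    base_url="https://your-gateway.example.com/api",
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    # Gateway-specific fields go through extra_body; the OpenAI SDK merges
    # them into the JSON request body as-is.
    extra_body={"customer_identifier": "customer_123"},  # illustrative field
)
print(response.choices[0].message.content)
```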
env parameter in API calls. To switch between environments (test/production), use different API keys - one for your test environment and another for production. You can manage these keys in your API Keys settings.

Properties
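A sketch of switching environments by selecting the matching API key at startup; the environment-variable names used here are assumptions.

```python
import os

from openai import OpenAI

# Pick the API key for the current environment. The variable names
# (APP_ENV, GATEWAY_TEST_API_KEY, GATEWAY_PROD_API_KEY) are placeholders.
api_key = (
    os.environ["GATEWAY_PROD_API_KEY"]
    if os.environ.get("APP_ENV") == "production"
    else os.environ["GATEWAY_TEST_API_KEY"]
)

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder base URL
    api_key=api_key,
)
```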
json_object, json_schema, or text

Example
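A sketch of JSON mode with the OpenAI Python SDK; the base URL and API key are placeholders. Note that the prompt includes the keyword "json", as required for this feature.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "Return a json object with keys 'city' and 'country' for Paris.",
    }],
    # json_object guarantees the generated message is valid JSON.
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)
```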
loadbalance_models parameter.

Example
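A sketch of weighted load balancing via extra_body; the shape of each entry (model plus weight) and the weight semantics are assumptions.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        # Illustrative entries; check your gateway's reference for the exact
        # shape of each load-balanced model entry.
        "loadbalance_models": [
            {"model": "gpt-4o-mini", "weight": 70},
            {"model": "claude-3-5-haiku-20241022", "weight": 30},
        ]
    },
)
```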
none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools. none is the default when no tools are present; auto is the default if tools are present. Specifying a particular tool via the code below forces the model to call that tool.

content of the message.

{ "type": "json_object" } enables JSON mode, which guarantees the message the model generates is valid JSON. If you want to specify your own output structure, use { "type": "json_schema", "json_schema": {...your schema} }. For more reference, please check OpenAI's guide on structured output. You must include "json" as a keyword in the prompt to use this feature.

Properties
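A sketch of forcing a specific tool call, following the tool choice description above. The tool definition is illustrative; the base URL and API key are placeholders.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    # Naming a specific function forces the model to call it.
    tool_choice={"type": "function", "function": {"name": "get_weather"}},
)
print(response.choices[0].message.tool_calls)
```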
json_object, json_schema, and text

Vertex AI example (special case)
response_schema in the response_format parameter. Check the details of the response schema here.

model parameter

Example
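A sketch of passing a response_schema for a Vertex AI (Gemini) model. The schema shape and the use of extra_body to carry the extended response_format are assumptions; the base URL and API key are placeholders.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gemini-1.5-pro",
    messages=[{"role": "user", "content": "List two capital cities as json."}],
    # The extended response_format (with response_schema) is sent via
    # extra_body so the SDK passes it through untouched.
    extra_body={
        "response_format": {
            "type": "json_object",
            "response_schema": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                },
            },
        }
    },
)
print(response.choices[0].message.content)
```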
Example code for adding credentials
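A heavily hedged sketch of attaching provider credentials to load-balanced entries; the credentials field name and its structure are assumptions, not a verified API.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        # "credentials" and "api_key" are illustrative placeholders; use the
        # exact credential fields your provider integration expects.
        "loadbalance_models": [
            {
                "model": "gpt-4o-mini",
                "weight": 50,
                "credentials": {"api_key": "YOUR_OPENAI_API_KEY"},
            },
            {
                "model": "claude-3-5-haiku-20241022",
                "weight": 50,
                "credentials": {"api_key": "YOUR_ANTHROPIC_API_KEY"},
            },
        ]
    },
)
```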
The models field will overwrite the load_balance_group you specified in the UI.

Example
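A sketch of referencing a load-balance group and overriding its models inline. The group_id value and the shape of the inline models entries are assumptions.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        "load_balance_group": {
            "group_id": "my-group",  # group configured in the UI
            # Supplying models here overwrites the group's UI configuration.
            "models": [
                {"model": "gpt-4o-mini", "weight": 60},
                {"model": "gpt-4o", "weight": 40},
            ],
        }
    },
)
```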
Example
Example
Example
Example
With the cache_by_customer option, you can set it to true or false. If cache_by_customer is set to true, the cache will be stored by the customer identifier.

Cached prompt tokens are reported in usage.prompt_tokens_details.cached_tokens. When cache entries are created, usage.prompt_tokens_details.cache_creation_tokens may be present. You may also see usage.cache_read_input_tokens (tokens read from cache) and usage.cache_creation_input_tokens (tokens added to cache). Depending on the model/provider, you may see both the prompt_tokens_details and cache_* fields. When there is no cache activity, these fields may be 0 or may be omitted by the provider.

Properties
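A sketch of enabling customer-scoped caching via extra_body; the cache_options wrapper and field names are assumptions.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        # Field names are assumptions: scope cache entries to the customer
        # identifier instead of sharing them across all customers.
        "cache_options": {"cache_by_customer": True},
        "customer_identifier": "customer_123",
    },
)
```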
When override is set to true, the request uses override_params instead of the params in the prompt.

Example
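A sketch of overriding a deployed prompt's parameters; the shape of the prompt object (prompt_id, variables, override, override_params) is an assumption.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "placeholder"}],
    extra_body={
        # Illustrative prompt object: with override set to true, the values in
        # override_params replace the params saved with the prompt.
        "prompt": {
            "prompt_id": "your_prompt_id",
            "variables": {"topic": "space travel"},
            "override": True,
            "override_params": {"temperature": 0.2, "max_tokens": 200},
        }
    },
)
```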
Example
When the model parameter is used, the router will not trigger, and this parameter behaves as fallback_models. The model parameter takes precedence over this parameter, and fallback_models and the safety net will still use the excluded models to catch failures.

Example
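A sketch following the description above; the models and exclude_models field names mirror the wording here and are assumptions.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    # Because model is set explicitly, the router does not trigger and the
    # "models" list below behaves as fallback_models.
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        "models": ["gpt-4o-mini", "claude-3-5-haiku-20241022"],
        "exclude_models": ["gpt-4o"],  # still usable by fallbacks / safety net
    },
)
```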
metadata parameter, because it's indexed. You can see it in Logs under the Custom ID field.

Example
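A sketch of sending both an indexed identifier and free-form metadata; the custom_identifier field name is an assumption based on the Custom ID field mentioned above.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        # custom_identifier (assumed name) is indexed and surfaces in Logs as
        # the Custom ID field; metadata carries free-form key/value pairs.
        "custom_identifier": "order-42",
        "metadata": {"session": "abc123", "plan": "pro"},
    },
)
```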
Example
Properties
Start and end dates use the YYYY-MM-DD format. Supported granularities are yearly, monthly, weekly, and daily.

Properties
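A hedged sketch of a usage/analytics query using these values; the endpoint path and parameter names (start_date, end_date, timeframe) are assumptions, only the date format and granularity values come from the description above.

```python
import requests

# Hypothetical analytics request; adjust the endpoint and parameter names
# to match your gateway's actual API.
resp = requests.get(
    "https://your-gateway.example.com/api/usage",  # placeholder endpoint
    headers={"Authorization": "Bearer YOUR_GATEWAY_API_KEY"},
    params={"start_date": "2024-01-01", "end_date": "2024-01-31", "timeframe": "daily"},
)
print(resp.json())
```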
True means the user liked the output.

model parameter.

Example
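A heavily hedged sketch of recording user feedback on a request; the positive_feedback field name is a placeholder for the boolean described above.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    # Placeholder field name: True records that the user liked the output.
    extra_body={"positive_feedback": True},
)
```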
All of these fields are returned in the usage object. This helps you reconcile token accounting across providers and caching scenarios.
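A sketch of reading the usage object defensively, since the cache-related fields are provider-dependent and may be absent.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api",  # placeholder
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)

usage = response.usage
print("prompt tokens:", usage.prompt_tokens)
print("completion tokens:", usage.completion_tokens)

# Cache fields vary by provider, so read them defensively.
details = getattr(usage, "prompt_tokens_details", None)
if details is not None:
    print("cached tokens:", getattr(details, "cached_tokens", 0) or 0)
print("cache read tokens:", getattr(usage, "cache_read_input_tokens", 0) or 0)
print("cache creation tokens:", getattr(usage, "cache_creation_input_tokens", 0) or 0)
```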