## Base URL

`https://api.litellm.ai`

No authentication required. All endpoints are publicly accessible.
## GET /model_catalog

List and filter models. Returns pricing, context windows, and capability metadata.

### Query Parameters
| Parameter | Type | Description |
|---|---|---|
| provider | string | Filter by provider (e.g. openai, anthropic, vertex_ai) |
| model | string | Search by model name (partial match) |
| mode | string | Filter by mode (e.g. chat, embedding, completion) |
| supports_vision | boolean | Filter models that support vision |
| supports_function_calling | boolean | Filter models that support function calling |
| supports_tool_choice | boolean | Filter models that support tool choice |
| supports_response_schema | boolean | Filter models that support response schema |
| limit | integer | Number of results to return (default: 50) |
| offset | integer | Offset for pagination |
### Example

```bash
curl 'https://api.litellm.ai/model_catalog?provider=openai&mode=chat&limit=2'
```
```json
{
  "data": [
    {
      "model": "gpt-4o",
      "provider": "openai",
      "mode": "chat",
      "max_input_tokens": 128000,
      "max_output_tokens": 16384,
      "input_cost_per_token": 0.0000025,
      "output_cost_per_token": 0.00001,
      "supports_vision": true,
      "supports_function_calling": true
    },
    {
      "model": "gpt-4o-mini",
      "provider": "openai",
      "mode": "chat",
      "max_input_tokens": 128000,
      "max_output_tokens": 16384,
      "input_cost_per_token": 1.5e-7,
      "output_cost_per_token": 6e-7,
      "supports_vision": true,
      "supports_function_calling": true
    }
  ],
  "total": 150,
  "limit": 2,
  "offset": 0
}
```
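The same query works from any HTTP client. Below is a minimal Python sketch using the `requests` library; the `list_models` helper is illustrative, not part of any official client, and it only assumes the query parameters and response shape documented above.

```python
import requests

BASE_URL = "https://api.litellm.ai"

def list_models(**filters):
    """Query /model_catalog with optional filters such as provider, mode, or limit."""
    response = requests.get(f"{BASE_URL}/model_catalog", params=filters, timeout=10)
    response.raise_for_status()
    return response.json()

# Mirror the curl example above: first two OpenAI chat models.
page = list_models(provider="openai", mode="chat", limit=2)
for model in page["data"]:
    print(model["model"], model["max_input_tokens"], model["input_cost_per_token"])
```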
## GET /model_catalog/{model_id}

Get detailed information for a single model by its ID.

### Example

```bash
curl 'https://api.litellm.ai/model_catalog/gpt-4o'
```
```json
{
  "model": "gpt-4o",
  "provider": "openai",
  "mode": "chat",
  "max_input_tokens": 128000,
  "max_output_tokens": 16384,
  "input_cost_per_token": 0.0000025,
  "output_cost_per_token": 0.00001,
  "cache_read_input_token_cost": 0.00000125,
  "supports_vision": true,
  "supports_function_calling": true,
  "supports_tool_choice": true,
  "supports_response_schema": true
}
```
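The per-token prices in this response can be used to estimate the cost of a request. A minimal sketch, assuming the prices are USD per token as shown above; the `estimate_cost` helper and the token counts are illustrative.

```python
import requests

BASE_URL = "https://api.litellm.ai"

def estimate_cost(model_id: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one call from the catalog's per-token prices."""
    info = requests.get(f"{BASE_URL}/model_catalog/{model_id}", timeout=10).json()
    return (
        input_tokens * info["input_cost_per_token"]
        + output_tokens * info["output_cost_per_token"]
    )

# Example: roughly 1,000 prompt tokens and 500 completion tokens on gpt-4o.
print(f"${estimate_cost('gpt-4o', 1000, 500):.6f}")
```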
## More Examples

### Filter by vision support

`GET /model_catalog?supports_vision=true&limit=10`

### Get all Anthropic models

`GET /model_catalog?provider=anthropic`

### Search by model name

`GET /model_catalog?model=claude`

### Embedding models only

`GET /model_catalog?mode=embedding`
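When a filter matches more models than one page returns, the `limit` and `offset` parameters can be combined with the `total` field to walk the full result set. A sketch under those documented assumptions; the `iter_models` generator is illustrative.

```python
import requests

BASE_URL = "https://api.litellm.ai"

def iter_models(**filters):
    """Yield every model matching the filters, paging with limit/offset."""
    offset = 0
    limit = filters.pop("limit", 50)
    while True:
        page = requests.get(
            f"{BASE_URL}/model_catalog",
            params={**filters, "limit": limit, "offset": offset},
            timeout=10,
        ).json()
        yield from page["data"]
        offset += limit
        if offset >= page["total"]:
            break

# Example: collect every Anthropic model in the catalog.
anthropic_models = [m["model"] for m in iter_models(provider="anthropic")]
print(len(anthropic_models))
```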
For the full interactive API documentation with try-it-out functionality, visit [api.litellm.ai/docs](https://api.litellm.ai/docs).