Conversation
- Add OllamaProvider class with OpenAI-compatible API support
- Register Ollama in ProviderRegistry with model selection rules
- Add Ollama configuration to AppConfig (baseUrl, apiKey, enabled)
- Add Ollama to chat_completions_providers_v1.json catalog with 16 popular models
- Add ollama.yaml pricing file (free/local models)
- Update ProviderName type to include 'ollama'
- Add OLLAMA_BASE_URL and OLLAMA_API_KEY to .env.example

Ollama runs models locally and exposes an OpenAI-compatible API at http://localhost:11434/v1 by default. Users can configure a custom base URL via the OLLAMA_BASE_URL environment variable.

Co-authored-by: Cursor <cursoragent@cursor.com>
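The configuration resolution described above (default base URL with an environment-variable override) can be sketched roughly as follows; the names `OllamaConfig` and `resolveOllamaConfig` are illustrative, not the PR's actual API:

```typescript
// Sketch of how the Ollama provider's config might be resolved.
// Field and function names are assumptions for illustration only.

interface OllamaConfig {
  baseUrl: string;
  apiKey?: string;
  enabled: boolean;
}

function resolveOllamaConfig(env: Record<string, string | undefined>): OllamaConfig {
  return {
    // Ollama's OpenAI-compatible endpoint listens here by default.
    baseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434/v1",
    // Ollama normally needs no API key; the field exists for parity with other providers.
    apiKey: env.OLLAMA_API_KEY,
    enabled: true,
  };
}
```

With no environment override this yields the local default; setting OLLAMA_BASE_URL points the provider at a remote Ollama instance.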
sm86
left a comment
overall, looking good. Appreciate the PR
        "codellama",
        "starcoder2"
      ],
      "chat_completions": {
Can we prioritize support for the Responses API? /chat/completions might soon be deprecated too.
Yes, added full support for the Responses API (/v1/responses) for Ollama in this PR. It implements the Open Responses spec as requested.
- Added Ollama to responses_providers_v1.json catalog
- Created OllamaResponsesPassthrough class implementing Responses API
- Registered Ollama in responses-passthrough-registry.ts

Ollama supports the OpenResponses API specification at the /v1/responses endpoint, providing future-proof support as /chat/completions may be deprecated.

Co-authored-by: Cursor <cursoragent@cursor.com>
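A passthrough of this kind mostly just forwards the request body to the upstream /v1/responses endpoint. A minimal sketch, assuming a shape for the class (the PR's actual OllamaResponsesPassthrough may differ):

```typescript
// Illustrative sketch of a Responses API passthrough for a local Ollama instance.
// The class shape here is an assumption, not the PR's real implementation.

class OllamaResponsesPassthrough {
  constructor(private baseUrl: string = "http://localhost:11434/v1") {}

  // Upstream URL for the OpenResponses-compatible endpoint.
  endpoint(): string {
    return `${this.baseUrl}/responses`;
  }

  // Forward the request body unchanged, since Ollama speaks the same spec.
  async forward(body: { model: string; input: string }): Promise<Response> {
    return fetch(this.endpoint(), {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify(body),
    });
  }
}
```

Because the spec is shared end to end, no request or response translation layer is needed, which keeps the passthrough small.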
Congratulations on your first PR. If you would like to contribute further, I am looking to add OAuth support for OpenAI and Claude so users can use their pre-existing subscriptions.
Thanks! Really happy to see this merged. I'd be interested in helping with the OAuth support for OpenAI and Claude. I'll take a look at how we can implement that and open a new PR when I have something ready.
I've opened a PR for OAuth support: https://github.com/ekailabs/ekai-gateway/pull/72. It adds authorize/callback/status routes and wires OAuth tokens into the OpenAI and Anthropic passthroughs so users can use their subscriptions. Happy to adjust based on your review.
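The authorize route in a flow like that typically just constructs a provider authorization URL and redirects the user to it. A hedged sketch of that one step, with placeholder values (the endpoint, client ID, and redirect URI here are not the linked PR's real configuration):

```typescript
// Hypothetical helper for the authorize step of an OAuth 2.0 authorization-code flow.
// All parameter values a caller passes in are placeholders for illustration.

function buildAuthorizeUrl(opts: {
  authEndpoint: string;
  clientId: string;
  redirectUri: string;
  state: string; // random value echoed back on the callback to prevent CSRF
}): string {
  const params = new URLSearchParams({
    response_type: "code",
    client_id: opts.clientId,
    redirect_uri: opts.redirectUri,
    state: opts.state,
  });
  return `${opts.authEndpoint}?${params.toString()}`;
}
```

The callback route would then exchange the returned code for tokens and store them for the passthroughs to use.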
feat: Add Ollama provider support