
feat: Add Ollama provider support #71

Merged
sm86 merged 3 commits into ekailabs:main from Shriiii01:feature/add-ollama-provider
Feb 10, 2026

Conversation

@Shriiii01
Contributor

  • Added OllamaProvider class with OpenAI-compatible API support
  • Registered Ollama in ProviderRegistry with model selection rules
  • Added Ollama configuration to AppConfig (baseUrl, apiKey, enabled)
  • Added Ollama to chat_completions_providers_v1.json catalog with 16 popular models
  • Added ollama.yaml pricing file (free/local models)
  • Updated ProviderName type to include 'ollama'
  • Added OLLAMA_BASE_URL and OLLAMA_API_KEY to .env.example

Ollama runs models locally and exposes an OpenAI-compatible API at http://localhost:11434/v1 by default. Users can configure a custom base URL via the OLLAMA_BASE_URL environment variable.
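To make the configuration behavior above concrete, here is a minimal sketch of how the base URL and API key might be resolved from the environment. The function names (`resolveOllamaBaseUrl`, `resolveOllamaApiKey`) are illustrative, not the PR's actual AppConfig API; the default URL and variable names come from the PR description.

```typescript
// Hypothetical sketch of the env-var resolution described in the PR.
// Names are illustrative; only OLLAMA_BASE_URL, OLLAMA_API_KEY, and the
// default http://localhost:11434/v1 come from the PR itself.
const DEFAULT_OLLAMA_BASE_URL = "http://localhost:11434/v1";

function resolveOllamaBaseUrl(env: Record<string, string | undefined>): string {
  // A missing or empty variable falls back to the local default.
  const raw = env["OLLAMA_BASE_URL"]?.trim();
  return raw ? raw : DEFAULT_OLLAMA_BASE_URL;
}

function resolveOllamaApiKey(env: Record<string, string | undefined>): string | undefined {
  // Local Ollama typically ignores the key; remote proxies may require one.
  return env["OLLAMA_API_KEY"];
}

console.log(resolveOllamaBaseUrl({})); // http://localhost:11434/v1
console.log(resolveOllamaBaseUrl({ OLLAMA_BASE_URL: "http://gpu-box:11434/v1" }));
```

Making the key optional matters because a purely local Ollama server does not authenticate requests, while a gateway fronting a shared server may.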


Co-authored-by: Cursor <cursoragent@cursor.com>
Contributor

@sm86 sm86 left a comment


Overall, looking good. Appreciate the PR.

"codellama",
"starcoder2"
],
"chat_completions": {
Contributor


Can we prioritize support for the Responses API? /chat/completions might soon be deprecated too.

Contributor Author


Yes, added full support for the Responses API (/v1/responses) for Ollama in this PR. It implements the Open Responses spec as requested.

- Added Ollama to responses_providers_v1.json catalog
- Created OllamaResponsesPassthrough class implementing Responses API
- Registered Ollama in responses-passthrough-registry.ts

Ollama supports the OpenResponses API specification at the /v1/responses endpoint,
providing future-proof support as /chat/completions may be deprecated.
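The passthrough approach described above can be sketched as a pure function that maps an incoming /v1/responses payload to the upstream request for the Ollama server. This is a hypothetical illustration: `buildUpstreamRequest` and `UpstreamRequest` are not names from the PR's OllamaResponsesPassthrough class, and only the /v1/responses path and default base URL come from the discussion.

```typescript
// Hypothetical sketch of what a Responses passthrough does: forward the
// client's payload to the configured Ollama server's /responses endpoint.
// Names here are illustrative, not the PR's actual API.
interface UpstreamRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildUpstreamRequest(
  baseUrl: string,
  apiKey: string | undefined,
  payload: unknown,
): UpstreamRequest {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  // Only attach Authorization when a key is configured; local Ollama
  // typically does not require one.
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;
  return {
    // Resolve /responses relative to the configured base URL.
    url: `${baseUrl.replace(/\/$/, "")}/responses`,
    headers,
    body: JSON.stringify(payload),
  };
}

const req = buildUpstreamRequest("http://localhost:11434/v1", undefined, {
  model: "llama3.2",
  input: "Hello",
});
console.log(req.url); // http://localhost:11434/v1/responses
```

Keeping the request construction pure like this makes the passthrough easy to test without a running Ollama server; the actual network call can then be a thin fetch around it.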

Co-authored-by: Cursor <cursoragent@cursor.com>
@Shriiii01 force-pushed the feature/add-ollama-provider branch from 0405b00 to 46afa66 on February 9, 2026 at 17:00
Co-authored-by: Cursor <cursoragent@cursor.com>
@Shriiii01 force-pushed the feature/add-ollama-provider branch from 46afa66 to 9e00d32 on February 9, 2026 at 17:02
@sm86 merged commit 6e0df22 into ekailabs:main on Feb 10, 2026
1 check passed
@sm86
Contributor

sm86 commented Feb 10, 2026

Congratulations on your first PR.

If you would like to contribute further, I am looking to add OAuth support for OpenAI and Claude so users can use their pre-existing subscriptions.

@Shriiii01
Contributor Author

Thanks! Really happy to see this merged.

I'd be interested in helping with the OAuth support for OpenAI and Claude. I'll take a look at how we can implement that and open a new PR when I have something ready.

@Shriiii01 Shriiii01 deleted the feature/add-ollama-provider branch February 10, 2026 18:07
@Shriiii01
Contributor Author

Shriiii01 commented Feb 10, 2026

I’ve opened a PR for OAuth support: https://github.com/ekailabs/ekai-gateway/pull/72 . It adds authorize/callback/status routes and wires OAuth tokens into the OpenAI and Anthropic passthroughs so users can use their subscriptions. Happy to adjust based on your review.

Shriiii01 pushed a commit to Shriiii01/ekai-gateway that referenced this pull request Mar 7, 2026