feat: detect credentials in interactive model setup menu #273
Closed
Conversation
…e sale UI. Needs a frontend PR too.
The frontend catch-all `/` HTTPRoute had no hostname restriction, meaning the entire UI (dashboard, sell modal, settings) was publicly accessible through the Cloudflare tunnel. Add `hostnames: ["obol.stack"]` to match the eRPC route pattern already in this branch. Also add CLAUDE.md guardrails documenting the local-only vs public route split and explicit NEVER rules to prevent future regressions.
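The fix described above can be sketched as a Gateway API HTTPRoute fragment. The route and gateway names here are illustrative; only the `hostnames` restriction mirrors the actual change:

```yaml
# Illustrative HTTPRoute sketch: without the hostnames field, this
# catch-all "/" route matched every Host header reaching the Cloudflare
# tunnel. Restricting it to the internal hostname keeps the UI local-only.
apiVersion: gateway.networking.k8s.io/v1
kind: HTTPRoute
metadata:
  name: frontend            # illustrative name
spec:
  parentRefs:
    - name: obol-gateway    # illustrative gateway name
  hostnames:
    - obol.stack            # the added restriction: local-only hostname
  rules:
    - matches:
        - path:
            type: PathPrefix
            value: /
```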
Four fixes for the sell-inference cluster routing introduced in #267:

1. Security: bind the gateway to 127.0.0.1 when NoPaymentGate=true so only cluster traffic (via the K8s Service+Endpoints bridge) can reach the unpaid listener; no host or LAN exposure.
2. Critical: use the parsed --listen port in the Service, Endpoints, and ServiceOffer spec instead of a hardcoded 8402, so non-default ports now work correctly.
3. k3s support: resolveHostIP() now checks DetectExistingBackend() for k3s and returns 127.0.0.1, matching the existing ollamaHostIPForBackend() strategy in internal/stack.
4. Migration: keep "obol-agent" as the default instance ID to preserve existing openclaw-obol-agent namespaces on upgrade, avoiding orphaned deployments when upgrading from pre-#267 installs.

Also bumps the frontend to v0.1.13-rc.1.
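Fix 1 can be sketched in Go as follows; `listenAddr` is a hypothetical helper for illustration, not the actual obol code:

```go
package main

import (
	"fmt"
	"net"
)

// listenAddr sketches the security fix: when the payment gate is disabled,
// bind only to loopback so the K8s Service+Endpoints bridge can reach the
// listener but hosts on the LAN cannot. (Hypothetical helper.)
func listenAddr(noPaymentGate bool, port int) string {
	if noPaymentGate {
		return fmt.Sprintf("127.0.0.1:%d", port) // loopback-only: no host/LAN exposure
	}
	return fmt.Sprintf("0.0.0.0:%d", port) // paid path: all interfaces
}

func main() {
	// Port 0 asks the OS for a free port; we only care about the bind address.
	ln, err := net.Listen("tcp", listenAddr(true, 0))
	if err != nil {
		panic(err)
	}
	defer ln.Close()
	fmt.Println(ln.Addr().(*net.TCPAddr).IP.String()) // prints 127.0.0.1
}
```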
When ~/.openclaw/openclaw.json specifies a cloud model as the agent's primary (e.g. anthropic/claude-sonnet-4-6), autoConfigureLLM() now detects the provider and API key from the environment (or .env in dev mode) and configures LiteLLM before OpenClaw setup runs. This makes agent chat work out of the box without a separate `obol model setup`.

Changes:
- internal/stack: add autoConfigureCloudProviders() with env + .env key resolution (.env is dev-mode only)
- internal/model: export ProviderFromModelName() and ProviderEnvVar(); add HasProviderConfigured() and LoadDotEnv()
- cmd/obol/model: update defaults to claude-sonnet-4-6 and gpt-4.1
- internal/model: update WellKnownModels with current flagship models (claude-opus-4-6, gpt-5.4, gpt-4.1, o4-mini)
- obolup.sh: add check_agent_model_api_key() to warn users before cluster start if a required API key is missing
Split ConfigureLiteLLM into PatchLiteLLMProvider (config-only) and RestartLiteLLM (restart + wait). autoConfigureLLM now patches Ollama and cloud providers first, then performs a single restart, halving startup time when both are configured.
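The split can be sketched with a toy stand-in that counts restarts; the type is illustrative, while the two method names mirror the description:

```go
package main

import "fmt"

// liteLLM is a toy stand-in for the LiteLLM deployment that counts restarts.
type liteLLM struct {
	providers []string
	restarts  int
}

// PatchLiteLLMProvider updates config only; it never restarts.
func (l *liteLLM) PatchLiteLLMProvider(name string) {
	l.providers = append(l.providers, name)
}

// RestartLiteLLM restarts once and (in the real code) waits for readiness.
func (l *liteLLM) RestartLiteLLM() {
	l.restarts++
}

// autoConfigureLLM patches every configured provider first, then restarts
// exactly once, instead of one restart per provider.
func autoConfigureLLM(l *liteLLM, ollama, cloud bool) {
	if ollama {
		l.PatchLiteLLMProvider("ollama")
	}
	if cloud {
		l.PatchLiteLLMProvider("anthropic")
	}
	if ollama || cloud {
		l.RestartLiteLLM()
	}
}

func main() {
	l := &liteLLM{}
	autoConfigureLLM(l, true, true)
	fmt.Println(l.restarts) // prints 1: one restart for two patches
}
```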
Instead of printing a warning that users miss, prompt for the API key during setup when a cloud model is detected in ~/.openclaw config. The key is exported so the subsequent obol bootstrap → stack up → autoConfigureLLM picks it up automatically. Falls back to a warning in non-interactive mode. Inspired by hermes-agent's interactive setup wizard pattern.
When `obol model setup` runs without --provider, the interactive menu now checks the environment for existing API keys and Ollama availability, showing detection badges next to each provider. If the user picks a provider with a detected credential, they are offered the option to reuse it instead of being prompted for a new key.

Detected sources:
- Anthropic: ANTHROPIC_API_KEY, CLAUDE_CODE_OAUTH_TOKEN
- OpenAI: OPENAI_API_KEY
- Ollama: reachable, with N model(s) available

The flag-based path (--provider, --api-key) is unchanged.

Closes #272 (Unit 2)
OisinKyne approved these changes on Mar 17, 2026
Merged
Summary
When `obol model setup` runs without `--provider`, the interactive menu now detects existing API keys and Ollama availability, showing badges next to each provider (e.g., "Anthropic (anthropic) — detected: ANTHROPIC_API_KEY").

Detected sources: ANTHROPIC_API_KEY and CLAUDE_CODE_OAUTH_TOKEN (Anthropic), OPENAI_API_KEY (OpenAI), Ollama reachability + model count.

Test plan

- `go build ./cmd/obol/` compiles cleanly
- `go vet ./cmd/obol/` passes
- `go test ./cmd/obol/ ./internal/model/` passes
- `ANTHROPIC_API_KEY=test-key obol model setup`: verify Anthropic shows a detection badge and prompts to reuse the key
- `obol model setup` without env vars: verify no badges and the normal key prompt
- `obol model setup --provider anthropic --api-key sk-ant-xxx`: verify the flag-based path is unchanged

Closes #272 (Unit 2)