Merged
11 changes: 11 additions & 0 deletions CLAUDE.md
@@ -107,6 +107,8 @@ k3d: 1 server, ports 80:80 + 8080:80 + 443:443 + 8443:443, `rancher/k3s:v1.35.1-

**LiteLLM gateway** (`llm` ns, port 4000): OpenAI-compatible proxy routing to Ollama/Anthropic/OpenAI. ConfigMap `litellm-config` (YAML config.yaml with model_list), Secret `litellm-secrets` (master key + API keys). Auto-configured with Ollama models during `obol stack up` (no manual `obol model setup` needed). `ConfigureLiteLLM()` patches config + Secret + restarts. Custom endpoints: `obol model setup custom --name --endpoint --model` (validates before adding). Paid remote inference stays on vanilla LiteLLM with a static route `paid/* -> openai/* -> http://127.0.0.1:8402`; no LiteLLM fork is required. OpenClaw always routes through LiteLLM (openai provider slot), never native providers; `dangerouslyDisableDeviceAuth` is enabled for Traefik-proxied access.
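
A hedged sketch of what the resulting `config.yaml` in the `litellm-config` ConfigMap could look like (the local model name and the Ollama `api_base` are illustrative assumptions; only the `paid/*` route and its target come from this doc):

```yaml
model_list:
  # Local Ollama model registered during `obol stack up` (name illustrative)
  - model_name: llama3
    litellm_params:
      model: ollama/llama3
      api_base: http://host.k3d.internal:11434
  # Static route for paid remote inference: paid/* -> openai/* -> buyer proxy
  - model_name: "paid/*"
    litellm_params:
      model: "openai/*"
      api_base: http://127.0.0.1:8402
```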

**Auto-configuration**: During `obol stack up`, `autoConfigureLLM()` detects host Ollama models and patches LiteLLM config so agent chat works immediately without manual `obol model setup`. During install, `obolup.sh` `check_agent_model_api_key()` reads `~/.openclaw/openclaw.json` agent model, resolves API key from environment (`ANTHROPIC_API_KEY`, `CLAUDE_CODE_OAUTH_TOKEN` for Anthropic; `OPENAI_API_KEY` for OpenAI), and exports it for downstream tools.

**Per-instance overlay**: `buildLiteLLMRoutedOverlay()` reuses the "ollama" provider slot, pointing it at `litellm.llm.svc:4000/v1` with `api: openai-completions`. Flow: app → litellm:4000 → route by model name → actual provider API.
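
Roughly, the overlay might look like this (field names are assumptions for illustration; only the endpoint and the `api` value come from the text above):

```json
{
  "providers": {
    "ollama": {
      "baseUrl": "http://litellm.llm.svc:4000/v1",
      "api": "openai-completions"
    }
  }
}
```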

## Standalone Inference Gateway
@@ -148,6 +150,15 @@ Skills = SKILL.md + optional scripts/references, embedded in `obol` binary (`int
4. **`OBOL_DEVELOPMENT=true`** — required for `obol stack up` to auto-build local images (x402-verifier, x402-buyer)
5. **Root-owned PVCs** — `-f` flag required to remove in `obol stack purge`

### OpenClaw Version Management

Three places pin the OpenClaw version — all must agree:
1. `internal/openclaw/OPENCLAW_VERSION` — source of truth (Renovate watches, CI reads)
2. `internal/openclaw/openclaw.go` — `openclawImageTag` constant
3. `obolup.sh` — `OPENCLAW_VERSION` shell constant for standalone installs

`TestOpenClawVersionConsistency` in `internal/openclaw/version_test.go` catches drift.
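
The drift check can be sketched as a small comparison (file contents and function names here are illustrative; the real test reads the three locations listed above from disk):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// checkDrift compares the version pinned in the three locations.
// The arguments stand in for file contents; the real test reads them from the repo.
func checkDrift(versionFile, imageTag, obolupSh string) error {
	want := strings.TrimSpace(versionFile)
	if imageTag != want {
		return fmt.Errorf("openclawImageTag is %q, OPENCLAW_VERSION file says %q", imageTag, want)
	}
	// Find the OPENCLAW_VERSION=... assignment in obolup.sh (quotes optional).
	re := regexp.MustCompile(`OPENCLAW_VERSION="?([^"\s]+)"?`)
	m := re.FindStringSubmatch(obolupSh)
	if m == nil {
		return fmt.Errorf("obolup.sh does not pin OPENCLAW_VERSION")
	}
	if m[1] != want {
		return fmt.Errorf("obolup.sh pins %q, OPENCLAW_VERSION file says %q", m[1], want)
	}
	return nil
}

func main() {
	// All three agree: no drift.
	fmt.Println(checkDrift("1.2.3\n", "1.2.3", `OPENCLAW_VERSION="1.2.3"`))
	// obolup.sh lags behind: drift reported.
	fmt.Println(checkDrift("1.2.3\n", "1.2.3", `OPENCLAW_VERSION="1.2.2"`))
}
```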

### Pitfalls

1. **Kubeconfig port drift** — k3d API port can change between restarts. Fix: `k3d kubeconfig write <name> -o .workspace/config/kubeconfig.yaml --overwrite`
49 changes: 48 additions & 1 deletion cmd/obol/model.go
@@ -60,17 +60,30 @@ func modelSetupCommand(cfg *config.Config) *cli.Command {

 	// Interactive mode if flags not provided
 	if provider == "" {
+		creds := detectCredentials()
 		providers, _ := model.GetAvailableProviders(cfg)
 		options := make([]string, len(providers))
 		for i, p := range providers {
-			options[i] = fmt.Sprintf("%s (%s)", p.Name, p.ID)
+			label := fmt.Sprintf("%s (%s)", p.Name, p.ID)
+			if det, ok := creds[p.ID]; ok {
+				label += fmt.Sprintf(" — detected: %s", det.source)
+			}
+			options[i] = label
 		}
 
 		idx, err := u.Select("Select a provider:", options, 0)
 		if err != nil {
 			return err
 		}
 		provider = providers[idx].ID
+
+		// If a credential was detected for the chosen provider, offer to use it
+		if det, ok := creds[provider]; ok && det.key != "" && apiKey == "" {
+			u.Infof("%s API key detected (%s)", providers[idx].Name, det.source)
+			if u.Confirm("Use detected credential?", true) {
+				apiKey = det.key
+			}
+		}
 	}
 
 	// Provider-specific flow
@@ -370,6 +383,40 @@ func providerInfo(id string) model.ProviderInfo {
 	return model.ProviderInfo{ID: id, Name: id}
 }
 
+// detectedCredential describes a credential found in the environment.
+type detectedCredential struct {
+	key    string // the actual API key value (empty for Ollama)
+	source string // human-readable description of where it was found
+}
+
+// detectCredentials checks the environment for existing provider credentials.
+// It returns a map of provider ID to detected credential info. Only providers
+// with a detected credential appear in the map.
+func detectCredentials() map[string]detectedCredential {
+	creds := make(map[string]detectedCredential)
+
+	// Anthropic: check ANTHROPIC_API_KEY, then CLAUDE_CODE_OAUTH_TOKEN
+	if key := os.Getenv("ANTHROPIC_API_KEY"); key != "" {
+		creds["anthropic"] = detectedCredential{key: key, source: "ANTHROPIC_API_KEY"}
+	} else if key := os.Getenv("CLAUDE_CODE_OAUTH_TOKEN"); key != "" {
+		creds["anthropic"] = detectedCredential{key: key, source: "CLAUDE_CODE_OAUTH_TOKEN"}
+	}
+
+	// OpenAI: check OPENAI_API_KEY
+	if key := os.Getenv("OPENAI_API_KEY"); key != "" {
+		creds["openai"] = detectedCredential{key: key, source: "OPENAI_API_KEY"}
+	}
+
+	// Ollama: check if reachable with models
+	if ollamaModels, err := model.ListOllamaModels(); err == nil && len(ollamaModels) > 0 {
+		creds["ollama"] = detectedCredential{
+			source: fmt.Sprintf("%d model(s) available", len(ollamaModels)),
+		}
+	}
+
+	return creds
+}
+
 // promptModelPull interactively asks the user which Ollama model to pull.
 func promptModelPull() (string, error) {
 	type suggestion struct {
9 changes: 8 additions & 1 deletion obolup.sh
@@ -1531,9 +1531,16 @@ except: pass
         return 0
     fi
 
+    # Anthropic-specific fallback: Claude Code subscription token
+    if [[ "$provider" == "anthropic" && -n "${CLAUDE_CODE_OAUTH_TOKEN:-}" ]]; then
+        export ANTHROPIC_API_KEY="$CLAUDE_CODE_OAUTH_TOKEN"
+        log_success "Claude Code subscription detected (CLAUDE_CODE_OAUTH_TOKEN)"
+        return 0
+    fi
+
     # Interactive: prompt for the API key (like hermes-agent's setup wizard)
     if [[ -c /dev/tty ]]; then
-        log_success "Your agent uses $primary_model ($provider_name)"
+        log_info "Your agent uses $primary_model ($provider_name)"
         echo ""
         local api_key=""
         read -r -p " $provider_name API key ($env_var): " api_key </dev/tty