From 7b10354962049d66bc25ea42613faea3b9eadce3 Mon Sep 17 00:00:00 2001
From: bussyjd
Date: Tue, 17 Mar 2026 17:09:42 +0800
Subject: [PATCH] feat: detect CLAUDE_CODE_OAUTH_TOKEN as Anthropic API key
 fallback

obolup.sh check_agent_model_api_key() now checks CLAUDE_CODE_OAUTH_TOKEN
when ANTHROPIC_API_KEY is missing, so developers with Claude Code
subscriptions skip the interactive prompt. Also documents the LLM
auto-configuration flow and OpenClaw version management in CLAUDE.md.

Closes #272 (Unit 3)
---
 CLAUDE.md | 11 +++++++++++
 obolup.sh |  7 +++++++
 2 files changed, 18 insertions(+)

diff --git a/CLAUDE.md b/CLAUDE.md
index 1de55441..e0ed19a9 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -107,6 +107,8 @@ k3d: 1 server, ports 80:80 + 8080:80 + 443:443 + 8443:443, `rancher/k3s:v1.35.1-
 **LiteLLM gateway** (`llm` ns, port 4000): OpenAI-compatible proxy routing to Ollama/Anthropic/OpenAI. ConfigMap `litellm-config` (YAML config.yaml with model_list), Secret `litellm-secrets` (master key + API keys). Auto-configured with Ollama models during `obol stack up` (no manual `obol model setup` needed). `ConfigureLiteLLM()` patches config + Secret + restarts. Custom endpoints: `obol model setup custom --name --endpoint --model` (validates before adding). Paid remote inference stays on vanilla LiteLLM with a static route `paid/* -> openai/* -> http://127.0.0.1:8402`; no LiteLLM fork is required. OpenClaw always routes through LiteLLM (openai provider slot), never native providers; `dangerouslyDisableDeviceAuth` is enabled for Traefik-proxied access.
 
+**Auto-configuration**: During `obol stack up`, `autoConfigureLLM()` detects host Ollama models and patches LiteLLM config so agent chat works immediately without manual `obol model setup`. During install, `obolup.sh` `check_agent_model_api_key()` reads `~/.openclaw/openclaw.json` agent model, resolves API key from environment (`ANTHROPIC_API_KEY`, `CLAUDE_CODE_OAUTH_TOKEN` for Anthropic; `OPENAI_API_KEY` for OpenAI), and exports it for downstream tools.
+
 **Per-instance overlay**: `buildLiteLLMRoutedOverlay()` reuses "ollama" provider slot pointing at `litellm.llm.svc:4000/v1` with `api: openai-completions`. App → litellm:4000 → routes by model name → actual API.
 
 ## Standalone Inference Gateway
 
@@ -148,6 +150,15 @@ Skills = SKILL.md + optional scripts/references, embedded in `obol` binary (`int
 4. **`OBOL_DEVELOPMENT=true`** — required for `obol stack up` to auto-build local images (x402-verifier, x402-buyer)
 5. **Root-owned PVCs** — `-f` flag required to remove in `obol stack purge`
 
+### OpenClaw Version Management
+
+Three places pin the OpenClaw version — all must agree:
+1. `internal/openclaw/OPENCLAW_VERSION` — source of truth (Renovate watches, CI reads)
+2. `internal/openclaw/openclaw.go` — `openclawImageTag` constant
+3. `obolup.sh` — `OPENCLAW_VERSION` shell constant for standalone installs
+
+`TestOpenClawVersionConsistency` in `internal/openclaw/version_test.go` catches drift.
+
 ### Pitfalls
 
 1. **Kubeconfig port drift** — k3d API port can change between restarts. Fix: `k3d kubeconfig write -o .workspace/config/kubeconfig.yaml --overwrite`
diff --git a/obolup.sh b/obolup.sh
index 6b237142..28ade1e1 100755
--- a/obolup.sh
+++ b/obolup.sh
@@ -1531,6 +1531,13 @@ except: pass
         return 0
     fi
 
+    # Anthropic-specific fallback: Claude Code subscription token
+    if [[ "$provider" == "anthropic" && -n "${CLAUDE_CODE_OAUTH_TOKEN:-}" ]]; then
+        export ANTHROPIC_API_KEY="$CLAUDE_CODE_OAUTH_TOKEN"
+        log_success "Claude Code subscription detected (CLAUDE_CODE_OAUTH_TOKEN)"
+        return 0
+    fi
+
    # Interactive: prompt for the API key (like hermes-agent's setup wizard)
    if [[ -c /dev/tty ]]; then
        log_info "Your agent uses $primary_model ($provider_name)"
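The fallback this patch adds can be exercised outside obolup.sh with a minimal standalone sketch. The conditional mirrors the diff; `resolve_anthropic_key` and the `log_success` stub are hypothetical stand-ins for the real `check_agent_model_api_key()` and obolup.sh's logging helpers, and the token value is a placeholder:

```shell
#!/usr/bin/env bash
# log_success is a stand-in for obolup.sh's logging helper.
log_success() { echo "ok: $*"; }

# Hypothetical extract of check_agent_model_api_key():
# keep an existing ANTHROPIC_API_KEY, otherwise fall back to
# CLAUDE_CODE_OAUTH_TOKEN when the provider is anthropic.
resolve_anthropic_key() {
    local provider="$1"
    if [[ -n "${ANTHROPIC_API_KEY:-}" ]]; then
        return 0
    fi
    if [[ "$provider" == "anthropic" && -n "${CLAUDE_CODE_OAUTH_TOKEN:-}" ]]; then
        export ANTHROPIC_API_KEY="$CLAUDE_CODE_OAUTH_TOKEN"
        log_success "Claude Code subscription detected (CLAUDE_CODE_OAUTH_TOKEN)"
        return 0
    fi
    return 1
}

# Demo: only the OAuth token is set, so the fallback exports it.
unset ANTHROPIC_API_KEY
export CLAUDE_CODE_OAUTH_TOKEN="sk-ant-oat-example"
resolve_anthropic_key anthropic
echo "ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}"
```

With `ANTHROPIC_API_KEY` already set, the function returns immediately and the token is never consulted, which matches the patch's ordering (environment key first, subscription token second, interactive prompt last).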