
fix: reroute /v1/responses with ChatGPT-mode JWT to chatgpt.com codex backend#183

Merged
arniesaha merged 1 commit into main from fix/codex-jwt-route-to-chatgpt-backend on May 3, 2026

Conversation

@arniesaha (Owner) commented May 3, 2026

Summary

  • OpenClaw's openai-codex provider always emits POST /v1/responses regardless of the configured upstream base URL. When the bearer token is a ChatGPT-mode JWT (eyJ...) rather than an sk-* API key, api.openai.com rejects the call — those tokens only work against chatgpt.com/backend-api/codex/*.
  • This change detects JWT-shaped bearer tokens hitting /v1/responses in the proxy and reroutes them to codex/responses (which already dispatches to the ChatGPT backend). sk-* API keys and the chat/completions surface are unchanged.
  • All other providers (anthropic, codex, etc.) pass through untouched.

Why

OpenClaw users running ChatGPT subscription auth through the agentweave proxy currently get auth-rejected at OpenAI. Without this reroute, the only workaround is to hand-edit the openclaw provider catalog to point at codex/responses, which fights the plugin's normalization.

Implementation

  • _is_chatgpt_mode_bearer(auth_header) — true iff token starts with eyJ and not sk- (Anthropic sk-ant-* keys are explicitly excluded by the sk- check).
  • _maybe_reroute_openai_to_codex(provider, path, auth_header) — narrow rewrite for provider == "openai" && path == "v1/responses" && JWT bearer. Returns (provider, path); no-op for everything else.
  • Wired into the request handler immediately after _detect_provider, so _extract_model, streaming detection, and upstream URL building all see the rewritten provider/path.

Test plan

  • Added TestChatGPTModeBearerDetection (5 cases): JWT with/without Bearer prefix, sk-* keys, sk-ant-*, empty header.
  • Added TestMaybeRerouteOpenAIToCodex (6 cases): reroute happy path, sk-key passthrough, no-auth passthrough, chat/completions unaffected, anthropic unaffected, codex provider unaffected.
  • Existing 162-test suite still passes.
  • End-to-end verification against a live OpenClaw codex session is pending — see openclaw-side silent-skip investigation (codex provider currently never emits /v1/responses traffic to exercise this path).

🤖 Generated with Claude Code

Commit message:

OpenClaw's openai-codex provider always emits POST /v1/responses regardless of
the upstream baseUrl. When the bearer token is a ChatGPT-mode JWT (eyJ...)
rather than an sk-* API key, api.openai.com rejects the call. ChatGPT-mode
tokens only work against chatgpt.com/backend-api/codex/*.

Detect JWT-shaped bearer tokens at /v1/responses and reroute them to
codex/responses (which dispatches to the ChatGPT backend), leaving sk-* keys
and the chat/completions surface unchanged.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@arniesaha arniesaha merged commit 92e6b1f into main May 3, 2026
5 checks passed
@arniesaha arniesaha deleted the fix/codex-jwt-route-to-chatgpt-backend branch May 3, 2026 04:15
