feat: add oauth-backed OpenAI Codex routing #513
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: f7862d6d58
```elixir
def attach_stream(model, context, opts, _finch_name) do
  ensure_oauth_mode!(opts)

  credential = ReqLLM.Auth.resolve!(model, opts)
```
Pass OAuth mode when resolving Codex stream credentials
`openai_codex` is defined as OAuth-only, but this call resolves credentials with the raw `opts` and does not force `auth_mode: :oauth`; `ReqLLM.Auth.resolve!/2` defaults to API-key mode when `auth_mode` is missing. In the streaming path (`ReqLLM.stream_text/3`), options are not run through `Provider.Options.process/4`, so provider defaults are never injected, and a call like `provider_options: [oauth_file: "..."]` fails with missing-API-key errors even though OAuth credentials are present.
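A minimal sketch of the suggested fix, assuming the call site shown in the diff hunk above (forcing the mode via `Keyword.put/3` is an illustration, not necessarily how this PR resolves it):

```elixir
def attach_stream(model, context, opts, _finch_name) do
  ensure_oauth_mode!(opts)

  # Force OAuth mode explicitly so ReqLLM.Auth.resolve!/2 cannot fall back
  # to API-key lookup when the streaming path skips Provider.Options.process/4.
  credential = ReqLLM.Auth.resolve!(model, Keyword.put(opts, :auth_mode, :oauth))
```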
Summary
- Adds an `openai_codex` provider that targets the ChatGPT Codex backend at `/backend-api/codex/responses`
- `oauth.json`/`auth.json` handling can return and persist `accountId`, refresh Codex credentials, and derive account ids from JWTs
- `openai_codex:*` works with normal `ReqLLM` APIs, including `generate_text/3`

What changed
- Added `ReqLLM.Providers.OpenAICodex` as an OAuth-only provider with Codex-specific headers, request shaping, SSE normalization, and non-streaming SSE decode
- Added `openai_codex:*` model resolution by reusing OpenAI catalog metadata and stamping `openai_codex_responses` wire metadata
- Auth resolution can supply `account_id`, and Codex requests can source it from opts, auth files, or JWT claims

Validation
- `mix quality` ✅
- `mix test test/providers/openai_codex_test.exs test/req_llm_test.exs test/req_llm/auth_test.exs test/providers/openai_test.exs test/provider/openai/responses_api_unit_test.exs test/req_llm/generation_test.exs` ✅
- `ReqLLM.generate_text("openai_codex:gpt-5.3-codex-spark", ...)` with an OAuth file ✅

Existing repo issues
- `mix test` still has one pre-existing failure in `test/providers/google_vertex_embedding_test.exs:107` on this branch; the same issue was observed before this change
- `mix mc "*:*"` currently fails broadly due to existing fixture gaps / missing recorded fixtures across many providers in the repo; I did not change fixture coverage in this PR
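For reference, a hedged usage sketch matching the manual validation call above (the prompt and the `oauth_file` path are illustrative assumptions; the source only shows the model string and that an OAuth file was used):

```elixir
# Assumes Codex OAuth credentials have already been written to the auth file.
{:ok, response} =
  ReqLLM.generate_text(
    "openai_codex:gpt-5.3-codex-spark",
    "Say hello",
    provider_options: [oauth_file: "~/.codex/auth.json"]
  )
```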