feat: add oauth-backed OpenAI Codex routing#513

Open
l3wi wants to merge 2 commits into agentjido:main from l3wi:feat/oauth-openai-routing

Conversation

@l3wi (Contributor)

@l3wi l3wi commented Mar 16, 2026

Summary

  • add a first-class openai_codex provider that targets the ChatGPT Codex backend at /backend-api/codex/responses
  • extend OAuth file handling so oauth.json / auth.json can return and persist accountId, refresh Codex credentials, and derive account ids from JWTs
  • add model fallback and response-builder wiring so openai_codex:* works with normal ReqLLM APIs, including generate_text/3
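As a sketch of the intended call shape, assuming the `provider_options: [oauth_file: ...]` option described in the review below; the prompt and the oauth file path are placeholders, and the model name is taken from the smoke check in the validation section:

```elixir
# Sketch: routing a text generation through the OAuth-backed Codex provider.
# The oauth_file path is a placeholder; point it at your oauth.json / auth.json.
{:ok, response} =
  ReqLLM.generate_text(
    "openai_codex:gpt-5.3-codex-spark",
    "Summarize this diff in one sentence.",
    provider_options: [oauth_file: "~/.codex/auth.json"]
  )
```

This requires live OAuth credentials, so it is illustrative rather than runnable as-is.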

What changed

  • added ReqLLM.Providers.OpenAICodex as an OAuth-only provider with Codex-specific headers, request shaping, SSE normalization, and non-streaming SSE decode
  • added openai_codex:* model resolution by reusing OpenAI catalog metadata and stamping openai_codex_responses wire metadata
  • extended auth plumbing so resolved OAuth credentials include account_id, and Codex requests can source it from opts, auth files, or JWT claims
  • documented the new provider and OAuth-file flow in the OpenAI guide and README
  • added provider, auth, generation, and model-resolution tests for the new path
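The JWT-derived account id mentioned above can be sketched as plain claims extraction; the claim name `"chatgpt_account_id"` is an assumption (the provider's real claim may differ), and JSON parsing here leans on OTP 27's built-in `:json` module to stay dependency-free:

```elixir
# Hypothetical sketch of deriving an account id from a JWT's payload claims.
# A JWT is three base64url segments: header.payload.signature; only the
# payload (claims) is needed to read an account id.
defmodule JwtAccountId do
  def from_token(token) do
    with [_header, payload, _sig] <- String.split(token, "."),
         {:ok, json} <- Base.url_decode64(payload, padding: false),
         # :json.decode/1 (OTP 27+) returns maps with binary keys.
         %{"chatgpt_account_id" => id} <- :json.decode(json) do
      {:ok, id}
    else
      _ -> :error
    end
  end
end
```

No signature verification is shown; the real auth plumbing would validate the token before trusting its claims.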

Validation

  • mix quality
  • mix test test/providers/openai_codex_test.exs test/req_llm_test.exs test/req_llm/auth_test.exs test/providers/openai_test.exs test/provider/openai/responses_api_unit_test.exs test/req_llm/generation_test.exs
  • live smoke check via ReqLLM.generate_text("openai_codex:gpt-5.3-codex-spark", ...) with an OAuth file ✅

Existing repo issues

  • mix test still has one pre-existing failure in test/providers/google_vertex_embedding_test.exs:107 on this branch; the same failure was observed before this change
  • mix mc "*:*" currently fails broadly due to existing fixture gaps / missing recorded fixtures across many providers in the repo; I did not change fixture coverage in this PR


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f7862d6d58

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

```elixir
def attach_stream(model, context, opts, _finch_name) do
  ensure_oauth_mode!(opts)

  credential = ReqLLM.Auth.resolve!(model, opts)
```


P1: Pass OAuth mode when resolving Codex stream credentials

openai_codex is defined as OAuth-only, but this call resolves credentials with raw opts and does not force auth_mode: :oauth; ReqLLM.Auth.resolve!/2 defaults to API-key mode when auth_mode is missing. In the streaming path (ReqLLM.stream_text/3), options are not run through Provider.Options.process/4, so provider defaults are never injected, and a call like provider_options: [oauth_file: "..."] fails with missing API-key errors even though OAuth credentials are present.
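A minimal sketch of the fix this comment suggests, under the assumption that ReqLLM.Auth.resolve!/2 honors an explicit :auth_mode option (as described above); the force_oauth/1 helper and module name are hypothetical:

```elixir
# Hypothetical helper: pin OAuth resolution in the streaming path, where
# Provider.Options.process/4 never runs and so never injects the provider's
# default auth_mode. Keyword.put/3 also overrides any stale :auth_mode value.
defmodule OauthOptsSketch do
  def force_oauth(opts), do: Keyword.put(opts, :auth_mode, :oauth)
end

# In attach_stream/4, credentials would then be resolved as:
# credential = ReqLLM.Auth.resolve!(model, OauthOptsSketch.force_oauth(opts))
```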

Useful? React with 👍 / 👎.

