Conversation

@aibrahim-oai (Collaborator) commented Oct 30, 2025

This helps us keep the prompt cache warm when delegating to Codex with the same history, e.g. in compact tasks.

It also lets us associate a forked task with the same prompt cache key.

The change adds prompt_cache_key as an optional parameter on Codex::spawn and passes it through to Session::new, storing it as a member of the Session struct (the source of truth). From there it is set on the client, which propagates it to the API; the client is constructed from the Session every turn.
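A minimal sketch of the flow described above (Codex::spawn → Session::new → per-turn client). The type and method names here are illustrative assumptions, not the actual codex-rs API:

```rust
// Illustrative only: Session/Client shapes are assumptions mirroring
// the description above, not the real codex-rs types.

#[derive(Clone)]
struct Session {
    conversation_id: String,
    /// Source of truth; falls back to the conversation id when unset.
    prompt_cache_key: Option<String>,
}

impl Session {
    fn new(conversation_id: String, prompt_cache_key: Option<String>) -> Self {
        Self { conversation_id, prompt_cache_key }
    }

    fn effective_cache_key(&self) -> &str {
        self.prompt_cache_key
            .as_deref()
            .unwrap_or(&self.conversation_id)
    }

    /// The client is rebuilt from the Session every turn, so it always
    /// carries the session's current key to the API.
    fn client(&self) -> Client<'_> {
        Client { prompt_cache_key: self.effective_cache_key() }
    }
}

struct Client<'a> {
    prompt_cache_key: &'a str,
}

fn main() {
    // Spawned with an explicit key (e.g. a fork): that key wins.
    let forked = Session::new("conv-2".into(), Some("conv-1".into()));
    assert_eq!(forked.client().prompt_cache_key, "conv-1");

    // Spawned without one: the conversation id is used.
    let base = Session::new("conv-1".into(), None);
    assert_eq!(base.client().prompt_cache_key, "conv-1");
    println!("ok");
}
```

Because the client borrows the key from the Session each turn, there is a single place to update if the key ever needs to change mid-conversation.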

@aibrahim-oai aibrahim-oai changed the title prompt cache key Add prompt cache key as a parameter when starting Codex Oct 30, 2025
@aibrahim-oai aibrahim-oai marked this pull request as ready for review October 30, 2025 05:20
@aibrahim-oai (Collaborator, Author) commented:

@codex review this


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review"

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".


Comment on lines +193 to +197
auth_manager,
history,
self.session_source.clone(),
Some(conversation_id.to_string()),
)

P1: Reuse ancestor prompt cache key when forking

Passing Some(conversation_id.to_string()) here only works for the very first fork of a base conversation, where the prompt cache key equals the conversation id. Once you fork a fork (or any conversation whose session was started with a custom prompt cache key), the current conversation's cache key is inherited from its ancestor via Session::get_prompt_cache_key(), not from its own id. By hard-coding the new fork to use its immediate conversation id as the key, we break cache affinity for nested forks and lose the optimization this change is trying to introduce. We should look up the existing conversation's prompt cache key and propagate that through Codex::spawn instead of defaulting to the conversation id.

Useful? React with 👍 / 👎.
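The inheritance the reviewer describes can be sketched as follows: when forking, look up the parent's *effective* cache key instead of hard-coding the new conversation id. The `fork` helper and field names are illustrative assumptions, not the actual codex-rs API:

```rust
// Sketch of the suggested fix; names are assumptions for illustration.

#[derive(Clone)]
struct Session {
    conversation_id: String,
    prompt_cache_key: Option<String>,
}

impl Session {
    /// Inherited ancestor key if present, else this conversation's id.
    fn get_prompt_cache_key(&self) -> String {
        self.prompt_cache_key
            .clone()
            .unwrap_or_else(|| self.conversation_id.clone())
    }
}

/// Forking propagates the ancestor's effective key, so nested forks
/// keep cache affinity with the original conversation.
fn fork(parent: &Session, new_id: &str) -> Session {
    Session {
        conversation_id: new_id.to_string(),
        prompt_cache_key: Some(parent.get_prompt_cache_key()),
    }
}

fn main() {
    let base = Session { conversation_id: "conv-a".into(), prompt_cache_key: None };
    let fork1 = fork(&base, "conv-b");
    let fork2 = fork(&fork1, "conv-c");
    // A fork of a fork still uses the base conversation's key, whereas
    // hard-coding the new conversation id would have broken affinity here.
    assert_eq!(fork2.get_prompt_cache_key(), "conv-a");
    println!("ok");
}
```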

@aibrahim-oai (Collaborator, Author) replied:

That's true. I need to think about it more tomorrow.

@github-actions github-actions bot locked and limited conversation to collaborators Oct 30, 2025
