openai[patch]: support built-in code interpreter and remote MCP tools #31304

Open
wants to merge 13 commits into master
Conversation

ccurme
Collaborator

@ccurme ccurme commented May 21, 2025

No description provided.

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. langchain Related to the langchain package labels May 21, 2025

@Copilot Copilot AI left a comment


Pull Request Overview

This PR adds support for OpenAI’s built-in code interpreter and remote MCP tools in the ChatOpenAI responses API integration.

  • Introduces new integration tests for code interpreter and MCP flows.
  • Updates core request/response handling to include code_interpreter and mcp tool calls.
  • Extends function-calling utilities and documentation to cover the new built-in tools.
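The overview above can be made concrete with a minimal sketch of the two new tool specs. The field names (`container`, `server_label`, `server_url`, `require_approval`) follow OpenAI's Responses API documentation at the time of writing and may change; the label and URL values are hypothetical. With this PR, dicts of this shape could be passed directly to `ChatOpenAI(..., use_responses_api=True).bind_tools([...])`.

```python
# Sketch of the built-in tool specs added by this PR (field names per
# OpenAI's Responses API docs; values here are hypothetical examples).
code_interpreter_tool = {
    "type": "code_interpreter",
    # "auto" asks the API to provision a sandbox container for the session.
    "container": {"type": "auto"},
}

mcp_tool = {
    "type": "mcp",
    "server_label": "deepwiki",                   # hypothetical label
    "server_url": "https://mcp.example.com/mcp",  # hypothetical URL
    "require_approval": "always",
}
```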

Reviewed Changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated no comments.

File Description
libs/partners/openai/tests/integration_tests/chat_models/test_responses_api.py Added tests for code interpreter and MCP tool bindings.
libs/partners/openai/langchain_openai/chat_models/base.py Enabled code_interpreter and mcp in bind_tools and request/response builders.
libs/core/langchain_core/utils/function_calling.py Updated convert_to_openai_tool to accept new tool types.
docs/docs/integrations/chat/openai.ipynb Added notebook examples for code interpreter and remote MCP.
Comments suppressed due to low confidence (2)

libs/partners/openai/tests/integration_tests/chat_models/test_responses_api.py:406

  • The Optional type is used here but not imported in this file. Add from typing import Optional to the imports.
full: Optional[BaseMessageChunk] = None
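The missing-import note can be illustrated with a self-contained sketch of the chunk-accumulation pattern the test uses, with the `Optional` import in place. `Chunk` here is a stand-in for langchain_core's `BaseMessageChunk`, which supports `+` for merging; the real class is not assumed to be installed.

```python
from typing import Optional


class Chunk:
    """Stand-in for BaseMessageChunk: chunks merge via the + operator."""

    def __init__(self, text: str) -> None:
        self.text = text

    def __add__(self, other: "Chunk") -> "Chunk":
        return Chunk(self.text + other.text)


# The accumulation pattern from the integration test, with the
# previously missing `Optional` import included above:
full: Optional[Chunk] = None
for chunk in [Chunk("Hello"), Chunk(", "), Chunk("world")]:
    full = chunk if full is None else full + chunk
```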

libs/partners/openai/langchain_openai/chat_models/base.py:3169

  • Built-in tool calls for computer_call are collected in computer_calls but never appended to input_. If preserving computer_call behavior, add input_.extend(computer_calls) alongside the new extends.
input_.extend(code_interpreter_calls)
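A reduced sketch of the request-builder logic Copilot is commenting on: built-in tool calls are bucketed by type and then appended to the Responses API `input_` list. The item shapes and ids here are hypothetical simplifications; the point is the `input_.extend(computer_calls)` line that the review flags as missing.

```python
# Hypothetical reduction of the request-builder loop in base.py:
# bucket built-in tool calls by type, then append each bucket to input_.
items = [
    {"type": "code_interpreter_call", "id": "ci_1"},
    {"type": "mcp_call", "id": "mcp_1"},
    {"type": "computer_call", "id": "cc_1"},
]

code_interpreter_calls, mcp_calls, computer_calls = [], [], []
for item in items:
    if item["type"] == "code_interpreter_call":
        code_interpreter_calls.append(item)
    elif item["type"] == "mcp_call":
        mcp_calls.append(item)
    elif item["type"] == "computer_call":
        computer_calls.append(item)

input_: list = []
input_.extend(code_interpreter_calls)
input_.extend(mcp_calls)
input_.extend(computer_calls)  # the extend the review suggests adding
```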

Collaborator

@eyurtsev eyurtsev left a comment


Looks good! Main thing is whether bearer tokens will be sent as part of traces; if so, we'll need to prevent that from happening.

@@ -553,10 +553,16 @@ def convert_to_openai_tool(
     Return OpenAI Responses API-style tools unchanged. This includes
     any dict with "type" in "file_search", "function", "computer_use_preview",
-    "web_search_preview".
+    "web_search_preview", "code_interpreter", or "mcp".
Collaborator


This should be

.. versionchanged:: 0.3.60

"source": [
"tool_outputs = response.additional_kwargs[\"tool_outputs\"]\n",
"assert len(tool_outputs) == 1\n",
"container_id = tool_outputs[0][\"container_id\"]\n",
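The notebook snippet above can be sketched against a mocked response, showing where `container_id` comes from. The dict shape mirrors the snippet; the id value is hypothetical, and on a real run `additional_kwargs` would come from the `AIMessage` returned by the model.

```python
# Mocked shape of response.additional_kwargs after a code_interpreter run;
# the "tool_outputs" / "container_id" keys mirror the notebook snippet,
# and the id value is hypothetical.
additional_kwargs = {
    "tool_outputs": [
        {"type": "code_interpreter_call", "container_id": "cntr_abc123"}
    ]
}

tool_outputs = additional_kwargs["tool_outputs"]
assert len(tool_outputs) == 1
container_id = tool_outputs[0]["container_id"]
```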
Collaborator


Maybe highlight this one as well? I might just be tired, but I didn't notice this line when skimming the code snippet while looking directly at the highlighted line below.

" \"role\": \"user\",\n",
" \"content\": [\n",
" {\n",
" \"type\": \"mcp_approval_response\",\n",
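The notebook fragment above builds a follow-up user message approving an MCP tool call, in the provider-native shape under discussion. A self-contained sketch of that message dict, where `approval_request_id` would come from the `mcp_approval_request` emitted by the previous response (the id here is hypothetical):

```python
# Sketch of the follow-up message approving an MCP tool call, in OpenAI's
# native Responses API shape; the approval_request_id is hypothetical and
# would be taken from the prior mcp_approval_request output item.
approval_message = {
    "role": "user",
    "content": [
        {
            "type": "mcp_approval_response",
            "approve": True,
            "approval_request_id": "mcpr_123",
        }
    ],
}
```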
Collaborator


Do we want to pass it directly in openai's current format or do we want to try to generalize?

There are a lot of ways to represent this part

{
    "type": "review",
    "approve": True,
}

{
    "type": "review",
    "action": "approved",
}


Historically the choice has always been:

  1. try to support syntax as closely as possible to provider
  2. do not generalize from n=1

cc @sydney-runkle worth for you to take a look at the representation as well

@dosubot dosubot bot added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label May 22, 2025