Different behaviour with Gemini models using OpenAI+OpenRouter #1735
@ChenghaoMou Since you're using the `OpenAIModel` class, the Gemini-specific JSON schema transformation is not applied. You can do so manually with a custom model subclass:

```python
from dataclasses import replace

from pydantic_ai.models import ModelRequestParameters
from pydantic_ai.models.gemini import _GeminiJsonSchema
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.tools import ToolDefinition


class ORGeminiModel(OpenAIModel):
    def customize_request_parameters(self, model_request_parameters: ModelRequestParameters) -> ModelRequestParameters:
        def _customize_tool_def(t: ToolDefinition):
            # Rewrite the tool's JSON schema into the subset Gemini accepts
            return replace(t, parameters_json_schema=_GeminiJsonSchema(t.parameters_json_schema).walk())

        return ModelRequestParameters(
            function_tools=[_customize_tool_def(tool) for tool in model_request_parameters.function_tools],
            allow_text_output=model_request_parameters.allow_text_output,
            output_tools=[_customize_tool_def(tool) for tool in model_request_parameters.output_tools],
        )
```

Let me know if that works. I'll be filing an issue later to start working out a more general solution to this problem of using a model class built for a specific API (OpenAI) and model (OpenAI models) with different actual APIs (OpenRouter) and models (Gemini), which can result in unexpected behavior like what you're seeing here.
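For context on what a schema walk like `_GeminiJsonSchema(...).walk()` does: Gemini's function-calling API accepts only a restricted subset of JSON Schema, so incompatible keywords have to be stripped or rewritten before the request is sent. The following is a minimal illustrative sketch of that kind of recursive transformation, not pydantic-ai's actual implementation; the set of stripped keys here is an assumption for demonstration only.

```python
# Sketch only: recursively drop JSON Schema keywords from a tool schema.
# The key list is a hypothetical example, not the real _GeminiJsonSchema logic.
UNSUPPORTED_KEYS = {"additionalProperties", "$schema", "title"}


def simplify_schema(schema):
    """Return a copy of `schema` with unsupported keywords removed at every level."""
    if isinstance(schema, dict):
        return {
            key: simplify_schema(value)
            for key, value in schema.items()
            if key not in UNSUPPORTED_KEYS
        }
    if isinstance(schema, list):
        return [simplify_schema(item) for item in schema]
    return schema


original = {
    "type": "object",
    "title": "Args",
    "additionalProperties": False,
    "properties": {"name": {"type": "string", "title": "Name"}},
}
print(simplify_schema(original))
# → {'type': 'object', 'properties': {'name': {'type': 'string'}}}
```

The real transformation in pydantic-ai is more involved (it also rewrites constructs Gemini represents differently), but the recursive-walk shape is the same idea.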
@DouweM Thanks a ton for the help! I can confirm that it solves my issue. Feel free to close this issue for your later one.
@ChenghaoMou With the changes in #1835, which include a new `OpenRouterProvider`, you can now use this:

```python
model = OpenAIModel(
    "google/gemini-2.0-flash-001", provider=OpenRouterProvider()
)
agent = Agent(model)
```

Or this:

```python
model = OpenAIModel("google/gemini-2.0-flash-001", provider="openrouter")
agent = Agent(model)
```

Or even this:

```python
agent = Agent("openrouter:google/gemini-2.0-flash-001")
```

If you have a chance, could you verify that that works as expected?
@DouweM thanks for the quick turnaround! I have tested your branch. Here is an issue I found: the model profile function seems to expect one form, but if I change to the other it behaves differently.
@ChenghaoMou Good catch, that's what I get for leaving tests to the end :) Fixed in the PR!
Initial Checks
Description
When using a Gemini model as follows:

It creates a different `parameters_json_schema` that causes the model to repeatedly fail to follow the schema, while GeminiModel + VertexProvider rarely fails.

Left: OpenAIModel with OpenAIProvider/OpenRouter
Right: GeminiModel with VertexProvider
Error Message
Example Code
Python, Pydantic AI & LLM client version