
[Issue]: Send additional parameter to model #1615


Open
bhuvnesh-vcti opened this issue Apr 11, 2025 · 0 comments

Comments


bhuvnesh-vcti commented Apr 11, 2025

Describe the issue

Hi, I was using pyautogen==0.7.3 with an OpenAI model deployed on a private Azure deployment server.

With the following llm_config, everything worked fine:

llm_config = {
    "model": model,
    "api_key": apikey,
    "base_url": URL,
    "api_type": "azure",
    "api_version": model_api_version,
    "max_retries": 3,
    "timeout": 60,
    "user": f'{{"appkey": "{app_key}"}}',
}

But with the latest pyautogen version I'm getting this error:

ValidationError: 2 validation errors for _LLMConfig
config_list.0.azure.max_retries
  Extra inputs are not permitted [type=extra_forbidden, input_value=3, input_type=int]
    For further information visit https://errors.pydantic.dev/2.11/v/extra_forbidden
config_list.0.azure.user
  Extra inputs are not permitted [type=extra_forbidden, input_value='{"appkey": "egai-prd-net...orkflow-1739370934494"}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/extra_forbidden
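For context, this kind of error comes from Pydantic v2 models declared with extra="forbid": any key not declared as a field is rejected with an extra_forbidden error. A minimal sketch of that behavior (AzureEntry is a hypothetical stand-in, not the actual _LLMConfig class):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

# Hypothetical model mirroring a config schema that forbids unknown fields.
class AzureEntry(BaseModel):
    model_config = ConfigDict(extra="forbid")  # unknown keys raise
    model: str
    api_key: str

try:
    # max_retries is not a declared field, so validation fails
    AzureEntry(model="gpt-4o-mini", api_key="sk-...", max_retries=3)
except ValidationError as e:
    print(e.error_count(), e.errors()[0]["type"])  # 1 extra_forbidden
```

So the question is essentially which fields the new config schema declares, and where extra per-request parameters like user are supposed to go now.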

How do I pass these parameters in the newer pyautogen?

Here is how it is done with the openai package:

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://chat-ai.cisco.com",
    api_key=token_response.json()["access_token"],
    api_version="2024-08-01-preview",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # model = "deployment_name"
    messages=message_with_history,
    user=f'{{"appkey": "{app_key}"}}',
)
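As an aside, building that user string with json.dumps is a bit safer than the hand-written f-string, since it produces valid JSON even if the key contains quotes or backslashes. A small sketch (the app_key value is made up for illustration):

```python
import json

app_key = "egai-prd-example"  # hypothetical value for illustration

# json.dumps handles escaping; the f-string above would break on quotes
user_field = json.dumps({"appkey": app_key})
print(user_field)  # {"appkey": "egai-prd-example"}
```

The resulting string can be passed to the user= argument exactly as in the snippet above.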

Steps to reproduce

No response

Screenshots and logs

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[2], line 12
      2 from typing import Dict, List, Optional, Union
      4 # Assume llm_config is defined
      5 # llm_config = { ... }
      6 # --- End Transcripts ---
   (...)
     10 
     11 # 1. Information Extractor Agent
---> 12 extractor = autogen.AssistantAgent(
     13     name="Information_Extractor",
     14     llm_config=llm_config,
     15     system_message="""You are a meticulous information extractor specializing in technical meeting transcripts (e.g., scrum, planning, architecture).
....
     33 """
     34 )
     36 # 2. Document Synthesizer Agent (Writer)
     37 synthesizer = autogen.AssistantAgent(
     38     name="Document_Synthesizer",
     39     llm_config=llm_config,
   (...)
     62 - Start your response directly with the first heading: "# Requirements"."""
     63 )

File ~/miniconda3/envs/npyautogen/lib/python3.12/site-packages/autogen/agentchat/assistant_agent.py:69, in AssistantAgent.__init__(self, name, system_message, llm_config, is_termination_msg, max_consecutive_auto_reply, human_input_mode, description, **kwargs)
     42 def __init__(
     43     self,
     44     name: str,
   (...)
     51     **kwargs: Any,
     52 ):
     53     """Args:
     54     name (str): agent name.
     55     system_message (str): system message for the ChatCompletion inference.
   (...)
     67         [ConversableAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/ConversableAgent).
     68     """
---> 69     super().__init__(
     70         name,
     71         system_message,
     72         is_termination_msg,
     73         max_consecutive_auto_reply,
     74         human_input_mode,
     75         llm_config=llm_config,
     76         description=description,
     77         **kwargs,
     78     )
     79     if logging_enabled():
     80         log_new_agent(self, locals())

File ~/miniconda3/envs/npyautogen/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py:270, in ConversableAgent.__init__(self, name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, default_auto_reply, description, chat_messages, silent, context_variables, functions, update_agent_state_before_reply)
    264     except TypeError as e:
    265         raise TypeError(
    266             "Please implement __deepcopy__ method for each value class in llm_config to support deepcopy."
    267             " Refer to the docs for more details: https://docs.ag2.ai/docs/user-guide/advanced-concepts/llm-configuration-deep-dive/#adding-http-client-in-llm_config-for-proxy"
    268         ) from e
--> 270 self.llm_config = self._validate_llm_config(llm_config)
    271 self.client = self._create_client(self.llm_config)
    272 self._validate_name(name)

File ~/miniconda3/envs/npyautogen/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py:498, in ConversableAgent._validate_llm_config(cls, llm_config)
    496         llm_config = cls.DEFAULT_CONFIG
    497 elif isinstance(llm_config, dict):
--> 498     llm_config = LLMConfig(**llm_config)
    499 elif isinstance(llm_config, LLMConfig):
    500     llm_config = llm_config.copy()

File ~/miniconda3/envs/npyautogen/lib/python3.12/site-packages/autogen/llm_config.py:84, in LLMConfig.__init__(self, **kwargs)
     81         modified_kwargs["config_list"] = [{**v, x: modified_kwargs[x]} for v in modified_kwargs["config_list"]]
     82         modified_kwargs.pop(x)
---> 84 self._model = self._get_base_model_class()(**modified_kwargs)

File ~/miniconda3/envs/npyautogen/lib/python3.12/site-packages/pydantic/main.py:253, in BaseModel.__init__(self, **data)
    251 # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    252 __tracebackhide__ = True
--> 253 validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    254 if self is not validated_self:
    255     warnings.warn(
    256         'A custom validator is returning a value other than `self`.\n'
    257         "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
    258         'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
    259         stacklevel=2,
    260     )

ValidationError: 2 validation errors for _LLMConfig
config_list.0.azure.max_retries
  Extra inputs are not permitted [type=extra_forbidden, input_value=3, input_type=int]
    For further information visit https://errors.pydantic.dev/2.11/v/extra_forbidden
config_list.0.azure.user
  Extra inputs are not permitted [type=extra_forbidden, input_value='{"appkey": "egai-prd-net...orkflow-1739370934494"}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/extra_forbidden

Additional Information

Package Name & Version: pyautogen==0.7.3
Operating System: Ubuntu 22.04.5 LTS
Python Version: 3.12
