Bug description

When calling the API, the request fails with "request timed out":
【ERROR | app.llm:ask_tool:260 - API error: Request timed out.】

Environment information

Win 11
Python 3.12
API: Ollama (deployed on a LAN server running Win Server 2022)
Model: qwq
No VPN/proxy in use

#################################################################
[llm]
model = "qwq:latest"
base_url = "http://10.0.0.****:11434/v1"
api_key = "ollama"
max_tokens = 24000
temperature = 0.8

[llm.vision]
model = "qwq:latest"
base_url = "http://10.0.0.****:11434/v1"
api_key = "ollama"
#################################################################
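A quick way to check whether the Ollama endpoint itself is slow (independently of OpenManus) is to time a direct chat completion against the same base_url. This is only a sanity-check sketch: it assumes the openai Python package is installed, and the masked 10.0.0.**** address from the config above must be replaced with the real server IP.

#####################################################
import time
from openai import OpenAI

# Point the standard OpenAI client at the LAN Ollama server (same values as config.toml).
client = OpenAI(base_url="http://10.0.0.****:11434/v1", api_key="ollama")

start = time.time()
resp = client.chat.completions.create(
    model="qwq:latest",
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(f"Round trip took {time.time() - start:.1f}s")
print(resp.choices[0].message.content)
#####################################################

If this simple request already takes longer than 60 seconds, the timeout in ask_tool is the likely culprit rather than the network.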
same error
me too
I found that reasoning models such as qwq take noticeably longer to respond; the thinking phase is probably long enough to trip the timeout. As a workaround I edited line 188 of OpenManus\app\llm.py (timeout: int = 60,), changing:

#####################################################
async def ask_tool(
    self,
    messages: List[Union[dict, Message]],
    system_msgs: Optional[List[Union[dict, Message]]] = None,
    timeout: int = 60,
    tools: Optional[List[dict]] = None,
    tool_choice: Literal["none", "auto", "required"] = "auto",
    temperature: Optional[float] = None,
    **kwargs,
):
#####################################################

to:

#####################################################
async def ask_tool(
    self,
    messages: List[Union[dict, Message]],
    system_msgs: Optional[List[Union[dict, Message]]] = None,
    timeout: int = 240,
    tools: Optional[List[dict]] = None,
    tool_choice: Literal["none", "auto", "required"] = "auto",
    temperature: Optional[float] = None,
    **kwargs,
):
#####################################################

This works in my testing, but if the calling code changes in a later version the edit may stop taking effect, so treat it as a reference only.
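If you would rather not hard-code a new default, one option is to resolve the timeout from an environment variable and pass it explicitly at the call site, so an upstream change to the default does not silently undo the fix. This is a sketch, not part of OpenManus; the variable name LLM_TOOL_TIMEOUT and the helper below are assumptions for illustration.

#####################################################
import os

# Hypothetical helper (not in OpenManus): resolve the tool-call timeout once, so slow
# reasoning models such as qwq can be given more time via the LLM_TOOL_TIMEOUT env var.
DEFAULT_TOOL_TIMEOUT = 60  # seconds, matching the current default of ask_tool

def get_tool_timeout() -> int:
    """Return the timeout in seconds, overridable with LLM_TOOL_TIMEOUT."""
    try:
        return int(os.environ.get("LLM_TOOL_TIMEOUT", DEFAULT_TOOL_TIMEOUT))
    except ValueError:
        return DEFAULT_TOOL_TIMEOUT

# Example call site (sketch):
#   response = await llm.ask_tool(messages, timeout=get_tool_timeout(), tools=tools)
#####################################################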