
[Bug]: Cannot switch local Ollama models in the pipeline #2040

@hicocsco

Description


Runtime environment

V4.8.7, Ubuntu 24.04.4, Docker

Exception

[Four screenshot attachments]

Reproduction steps

No response

Enabled plugins

No response

Metadata

Assignees

No one assigned

    Labels

    bug? — Bug or bug-fix related / maybe a bug
    m: Provider — LLM model related / LLMs management
    pd: Need reproducing — pending: issue needs testing to reproduce; if you encounter the same problem, please provide more useful information / Please add more info as you can for us to reproduce

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
