[Bug]: After upgrading from 4.9.4 to 4.9.5, switching to a locally deployed LLM fails again #2111

@hicocsco

Description

Runtime environment

v4.9.5, Ubuntu 24.04, Python 3.12.3

Exception

With v4.9.4 I could switch to the Ollama instance deployed on our intranet without problems. After an online upgrade to 4.9.5, switching models raises an error.

(Screenshot) The local model passes verification. (Screenshot) The local LLM is selected in the pipeline and saved. (Screenshot) The pipeline debug window reports an error; even after switching to Alibaba Cloud Bailian, the same error is raised.
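Since the local model passes verification but fails at pipeline run time, a quick way to rule out connectivity is to query the Ollama server directly from the host running the application. A minimal sketch, assuming the default Ollama address `http://localhost:11434` (adjust to your intranet host) and its standard `GET /api/tags` endpoint:

```python
import json
import urllib.request
import urllib.error

def list_ollama_models(base_url="http://localhost:11434"):
    """Ask the Ollama REST API which models are available locally.

    Returns a list of model names, or None if the server is unreachable,
    so the caller can distinguish a connectivity problem from a missing model.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            payload = json.load(resp)
    except (urllib.error.URLError, OSError):
        return None
    return [m.get("name") for m in payload.get("models", [])]

if __name__ == "__main__":
    models = list_ollama_models()
    if models is None:
        print("Ollama server unreachable")
    else:
        print("Available models:", models)
```

If this lists the expected model but the pipeline still errors, the problem is more likely in how 4.9.5 stores or passes the provider configuration than in the Ollama deployment itself.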

Reproduction steps

No response

Enabled plugins

No response

Metadata

Labels

- bug?: Bug or bug-fix related / maybe a bug
- m: Provider: LLM model related / LLMs management
- pd: Need reproducing: pending, the issue needs testing to reproduce; if you encounter the same problem, please provide more useful information / Please add more info as you can for us to reproduce
