[Bug]: After upgrading from 4.9.4 to 4.9.5, switching to the locally deployed LLM fails again #2111
Open
Labels
- bug? — maybe a bug / bug-fix related
- m: Provider — LLMs management
- pd: Need reproducing — pending: this issue needs testing to reproduce; if you hit the same problem, please add as much useful information as you can to help us reproduce it
Runtime environment
v4.9.5, Ubuntu 24.04, Python 3.12.3
Exception
With v4.9.4, switching to the Ollama instance deployed on our internal network worked normally; after upgrading online to 4.9.5, switching models throws an error.
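Not part of the original report: a minimal sketch that may help isolate whether the 4.9.5 client or the Ollama server is at fault, by querying Ollama's standard `/api/tags` REST endpoint directly. The `OLLAMA_URL` value is an assumption (the default local port); replace it with the internal deployment's address.

```python
import json
from urllib import request, error

# Default Ollama endpoint; replace with your internal deployment's host (assumption).
OLLAMA_URL = "http://localhost:11434"

def parse_tags_payload(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def list_ollama_models(base_url: str = OLLAMA_URL, timeout: float = 5.0) -> list[str]:
    """Query the Ollama /api/tags endpoint and return the available model names."""
    try:
        with request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return parse_tags_payload(json.load(resp))
    except error.URLError as exc:
        raise RuntimeError(f"Ollama not reachable at {base_url}: {exc}") from exc
```

If `list_ollama_models()` succeeds from the machine running v4.9.5, the server side is healthy and the regression is likely in how 4.9.5 builds or authenticates its provider request.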
Reproduction steps
No response
Enabled plugins
No response