
fix(model): preserve model UUID in runtime after update, fixing "No LLM model configured" error#2138

Draft
Copilot wants to merge 2 commits into master from copilot/bugfix-switching-llm-error

Conversation

Contributor

Copilot AI commented Apr 19, 2026

Overview

Saving a model's configuration via the UI caused subsequent pipeline runs to fail with RuntimeError: No LLM model configured for local-agent runner. Root cause: update_llm_model (and update_embedding_model) correctly strips uuid from model_data before the SQL UPDATE, but then passes that same uuid-less dict to construct the in-memory LLMModel entity — leaving model_entity.uuid = None in the runtime model list. Every subsequent get_model_by_uuid(uuid) call fails to match it, producing the warning LLM model {uuid} not found or not configured and ultimately crashing the pipeline.

Fix (pkg/api/http/service/model.py):

```python
# Before — runtime entity gets uuid=None
runtime_llm_model = await self.ap.model_mgr.load_llm_model_with_provider(
    persistence_model.LLMModel(**model_data),
    runtime_provider,
)

# After — uuid explicitly restored from the URL parameter
runtime_llm_model = await self.ap.model_mgr.load_llm_model_with_provider(
    persistence_model.LLMModel(uuid=model_uuid, **model_data),
    runtime_provider,
)
```

Same fix applied to update_embedding_model.
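The failure mode is easy to reproduce in isolation. Below is a minimal, self-contained sketch of the pattern; the `LLMModel` dataclass and `ModelManager` here are simplified stand-ins, not the project's actual classes, and `model_uuid` stands for the UUID taken from the URL path parameter:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMModel:
    # Simplified stand-in for the persistence entity: uuid silently
    # defaults to None when omitted from the constructor kwargs.
    name: str
    uuid: Optional[str] = None

class ModelManager:
    # Simplified stand-in for the runtime model registry.
    def __init__(self) -> None:
        self.models: list[LLMModel] = []

    def get_model_by_uuid(self, uuid: str) -> LLMModel:
        for m in self.models:
            if m.uuid == uuid:
                return m
        raise RuntimeError("No LLM model configured")

mgr = ModelManager()
model_uuid = "abc-123"
model_data = {"uuid": model_uuid, "name": "local-llama"}

# The update handler strips uuid before the SQL UPDATE (correct there,
# since uuid is the primary key and must not be overwritten):
sql_data = dict(model_data)
sql_data.pop("uuid")

# Bug: reusing the uuid-less dict leaves the runtime entity with uuid=None,
# so no later get_model_by_uuid(model_uuid) call can ever match it.
buggy = LLMModel(**sql_data)
assert buggy.uuid is None

# Fix: restore the uuid explicitly when constructing the runtime entity.
fixed = LLMModel(uuid=model_uuid, **sql_data)
mgr.models.append(fixed)
assert mgr.get_model_by_uuid(model_uuid) is fixed
```

The key point is that popping `uuid` was only meant to scope the SQL UPDATE payload; reusing the mutated dict for entity construction leaked that transformation into the runtime state.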

Screenshots (Before/After)

Before:

RuntimeError: No LLM model configured for local-agent runner after saving model config

After:

Pipeline correctly resolves the model by UUID and runs normally

Checklist

For PR author

Please mark completed items with an x between the brackets.

  • Have you read the contribution guide?
  • Have you communicated with the project maintainer?
  • I have tested the changes and ensured they work as expected.

For project maintainer

  • Have you linked the related issues?
  • Have you written the configuration items and the migration? Have they taken effect?
  • Have you added the dependencies to pyproject.toml and core/bootutils/deps.py?
  • Have you written the documentation?

Copilot AI changed the title [WIP] Fix bug with switching local LLM on upgrade to 4.9.5 fix(model): preserve model UUID in runtime after update, fixing "No LLM model configured" error Apr 19, 2026
Copilot AI requested a review from RockChinQ April 19, 2026 14:36


Development

Successfully merging this pull request may close these issues.

[Bug]: After upgrading from 4.9.4 to 4.9.5, switching to a locally deployed LLM fails again
