Conversation

@Soulter
Member

@Soulter Soulter commented Jan 15, 2026

Modifications

  • This is NOT a breaking change.

Screenshots or Test Results


Checklist

  • 😊 If there are new features added in the PR, I have discussed them with the authors through issues/emails, etc.
  • 👀 My changes have been well-tested, and "Verification Steps" and "Screenshots" have been provided above.
  • 🤓 I have ensured that no new dependencies are introduced, or, if new dependencies are introduced, they have been added to the appropriate locations in requirements.txt and pyproject.toml.
  • 😮 My changes do not introduce malicious code.

Summary by Sourcery

Ensure webchat streaming responses are correctly associated with their originating requests using message IDs.

Bug Fixes:

  • Prevent interleaving or mismatching of webchat streaming messages across concurrent sessions by filtering on a shared message_id.

Enhancements:

  • Propagate a stable message_id through webchat events, dashboard streaming responses, and adapter messages to consistently tag outgoing messages.
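
To make the mechanism concrete, the change described above boils down to a tag-and-filter loop. The sketch below is illustrative only: the `stream_response` / `event_queue` names and the `type == "end"` convention are assumptions, not AstrBot's actual dashboard API.

```python
import asyncio
import uuid


async def stream_response(event_queue: asyncio.Queue):
    """Tag the request with a fresh message_id and forward only events
    that carry the same id, so concurrent requests in one session cannot
    interleave their chunks."""
    message_id = str(uuid.uuid4())

    # ... enqueue the user message for the adapter, carrying message_id ...

    while True:
        result = await event_queue.get()
        if result.get("message_id") != message_id:
            # Chunk belongs to another in-flight request; skip it.
            continue
        yield result
        if result.get("type") == "end":
            break
```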

@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Jan 15, 2026
@Soulter Soulter merged commit 0e3d224 into master Jan 15, 2026
4 of 5 checks passed
@Soulter Soulter deleted the fix/chatui-message-order branch January 15, 2026 05:11
Soulter added a commit that referenced this pull request Jan 15, 2026
Contributor

@sourcery-ai sourcery-ai bot left a comment

Hey - I've found 2 issues, and left some high level feedback:

  • In convert_message, falling back to a generated message_id when payload.get("message_id") is missing would keep backward compatibility with older clients and avoid None IDs propagating through the system.
  • In chat.stream, consider lowering the message_id mismatch log to debug or including more context (e.g., both IDs, user/session info) to reduce noisy warnings while still making cross-request mixing issues diagnosable.
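
A lower-noise version of that mismatch log might look roughly like the sketch below; the helper name, the `session_id` argument, and the logger setup are assumptions, not the project's actual code.

```python
import logging

logger = logging.getLogger(__name__)


def log_mismatched_chunk(expected_id: str, result: dict, session_id: str) -> None:
    """Log a dropped chunk at debug level with enough context to diagnose
    cross-request mixing without producing noisy warnings."""
    logger.debug(
        "Dropping stream chunk with mismatched message_id "
        "(expected=%s, got=%s, session=%s)",
        expected_id,
        result.get("message_id"),
        session_id,
    )
```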
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- In `convert_message`, falling back to a generated `message_id` when `payload.get("message_id")` is missing would keep backward compatibility with older clients and avoid `None` IDs propagating through the system.
- In `chat.stream`, consider lowering the `message_id` mismatch log to debug or including more context (e.g., both IDs, user/session info) to reduce noisy warnings while still making cross-request mixing issues diagnosable.

## Individual Comments

### Comment 1
<location> `astrbot/core/platform/sources/webchat/webchat_adapter.py:200` </location>
<code_context>
         abm.session_id = f"webchat!{username}!{cid}"

-        abm.message_id = str(uuid.uuid4())
+        abm.message_id = payload.get("message_id")

         # 处理消息段列表
</code_context>

<issue_to_address>
**issue (bug_risk):** Relying directly on client-provided `message_id` may introduce `None`/invalid IDs and trust issues.

Previously this was always a server-generated UUID, so downstream code could rely on `message_id` being present, non-empty, and unique. With the client value used directly, it can be missing, `None`, or arbitrary, which may break that assumption and weaken trust in the correlation ID. Please add basic validation (presence/type/format) and fall back to a server-generated UUID when the client value is absent or invalid.
</issue_to_address>
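
One way to implement the suggested guard is sketched below. `resolve_message_id` is a hypothetical helper and the validation rules are an assumption; `payload` follows the name used in the diff above. In `convert_message` it would be used as `abm.message_id = resolve_message_id(payload)`.

```python
import uuid


def resolve_message_id(payload: dict) -> str:
    """Accept a well-formed client-provided id, otherwise fall back to a
    server-generated UUID so downstream code can keep assuming message_id
    is always present and unique. (Hypothetical helper, not merged code.)"""
    raw_id = payload.get("message_id")
    if isinstance(raw_id, str) and raw_id.strip():
        return raw_id.strip()
    # Missing, None, empty, or non-string value from the client.
    return str(uuid.uuid4())
```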

### Comment 2
<location> `astrbot/dashboard/routes/chat.py:299` </location>
<code_context>
         # 构建用户消息段(包含 path 用于传递给 adapter)
         message_parts = await self._build_user_message_parts(message)

+        message_id = str(uuid.uuid4())
+
         async def stream():
</code_context>

<issue_to_address>
**issue (bug_risk):** Dropping mismatched `message_id` responses might cause subtle stream hangs or partial outputs.

If `result["message_id"] != message_id`, we currently just log and `continue`. If the backend ever consistently returns a wrong ID (e.g., bug, misrouted session, proxy mixing streams), the client will see an open connection but no tokens, effectively a hang with no explicit error. Consider treating this as a hard failure (breaking/aborting the stream) or surfacing an error to the caller instead of silently dropping all data.
</issue_to_address>
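
The "fail fast" alternative could look roughly like the sketch below. The generator name, queue interface, and the error-event shape are assumptions, not the dashboard's actual streaming protocol.

```python
import asyncio
import logging

logger = logging.getLogger(__name__)


async def forward_matching_chunks(queue: asyncio.Queue, message_id: str):
    """Abort the stream on a mismatched id instead of silently skipping
    chunks, so the client gets an explicit error rather than a hang."""
    while True:
        result = await queue.get()
        if result.get("message_id") != message_id:
            logger.error(
                "message_id mismatch (expected=%s, got=%s); aborting stream",
                message_id,
                result.get("message_id"),
            )
            # Surface an explicit error event so the caller can close cleanly.
            yield {"type": "error", "data": "stream routing error"}
            return
        yield result
        if result.get("type") == "end":
            return
```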

Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.

abm.session_id = f"webchat!{username}!{cid}"

abm.message_id = str(uuid.uuid4())
abm.message_id = payload.get("message_id")
Contributor


issue (bug_risk): Relying directly on client-provided message_id may introduce None/invalid IDs and trust issues.

Previously this was always a server-generated UUID, so downstream code could rely on message_id being present, non-empty, and unique. With the client value used directly, it can be missing, None, or arbitrary, which may break that assumption and weakens correlation ID trust. Please add basic validation (presence/type/format) and fall back to a server-generated UUID when the client value is absent or invalid.

         # 构建用户消息段(包含 path 用于传递给 adapter)
         message_parts = await self._build_user_message_parts(message)

+        message_id = str(uuid.uuid4())
Contributor


issue (bug_risk): Dropping mismatched message_id responses might cause subtle stream hangs or partial outputs.

If result["message_id"] != message_id, we currently just log and continue. If the backend ever consistently returns a wrong ID (e.g., bug, misrouted session, proxy mixing streams), the client will see an open connection but no tokens, effectively a hang with no explicit error. Consider treating this as a hard failure (breaking/aborting the stream) or surfacing an error to the caller instead of silently dropping all data.

@dosubot dosubot bot added area:platform The bug / feature is about IM platform adapter, such as QQ, Lark, Telegram, WebChat and so on. feature:chatui The bug / feature is about astrbot's chatui, webchat labels Jan 15, 2026
Soulter added a commit that referenced this pull request Jan 15, 2026
* stage

* fix: update tool call logging to include tool call IDs and enhance sandbox ship creation parameters

* feat: file upload

* fix

* update

* fix: remove 'boxlite' option from booter and handle error in PythonTool execution

* feat: implement singleton pattern for ShipyardSandboxClient and add FileUploadTool for file uploads

* feat: sandbox

* fix

* beta

* uv lock

* remove

* chore: makes world better

* feat: implement localStorage persistence for showReservedPlugins state

* docs: refine EULA

* fix

* feat: add availability check for sandbox in Shipyard and base booters

* feat: add shipyard session configuration options and update related tools

* feat: add file download functionality and update shipyard SDK version

* fix: sending OpenAI-style image_url causes Anthropic 400 invalid tag error (#4444)

* feat: chatui project (#4477)

* feat: chatui-project

* fix: remove console log from getProjects function

* fix: title saving logic and update project sessions on changes

* docs: standardize Context class documentation formatting (#4436)

* docs: standardize Context class documentation formatting

- Unified all method docstrings to standard format
- Fixed mixed language and formatting issues
- Added complete parameter and return descriptions
- Enhanced developer experience for plugin creators
- Fixes #4429

* docs: fix Context class documentation issues per review

- Restored Sphinx directives for versionadded notes
- Fixed MessageSesion typo to MessageSession throughout file
- Added clarification for kwargs propagation in tool_loop_agent
- Unified deprecation marker format
- Fixes #4429

* Convert developer API comments to English

* chore: revise comments

---------

Co-authored-by: Soulter <37870767+Soulter@users.noreply.github.com>

* fix: handle empty output case in PythonTool execution

* fix: update description for command parameter in ExecuteShellTool

* refactor: remove unused file tools and update PythonTool output handling

* project list

* fix: ensure message stream order (#4487)

* feat: enhance iPython tool rendering with Shiki syntax highlighting

* bugfixes

* feat: add sandbox mode prompt for enhanced user guidance in executing commands

* chore: remove skills prompt

---------

Co-authored-by: 時壹 <137363396+KBVsent@users.noreply.github.com>
Co-authored-by: Li-shi-ling <114913764+Li-shi-ling@users.noreply.github.com>

Labels

  • area:platform: The bug / feature is about IM platform adapter, such as QQ, Lark, Telegram, WebChat and so on.
  • feature:chatui: The bug / feature is about astrbot's chatui, webchat.
  • size:M: This PR changes 30-99 lines, ignoring generated files.

2 participants