Conversation

@francismiko
Contributor

I extracted the duplicated logic in toolChoice, promptCall, and chat into common functions and moved this reused logic into global/core/ai/request.ts.
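The shape of such an extraction can be pictured with a small sketch. All names here (`LLMStreamEvent`, `collectLLMResponse`) are illustrative only, not the actual request.ts API: the idea is that each dispatcher consumes stream chunks through one shared fold instead of re-implementing the accumulation loop.

```typescript
// Hypothetical sketch of a shared stream-folding helper.
// Names are illustrative, not FastGPT's real request.ts exports.

type LLMStreamEvent = {
  answer?: string; // text delta for the assistant answer
  toolCallDelta?: string; // raw fragment of a tool-call payload
};

type LLMResponse = {
  answerText: string;
  toolCallText: string;
};

// Fold a sequence of chunk events into one consolidated response object,
// so toolChoice, promptCall, and chat can all reuse the same loop.
function collectLLMResponse(events: LLMStreamEvent[]): LLMResponse {
  let answerText = '';
  let toolCallText = '';
  for (const event of events) {
    if (event.answer) answerText += event.answer;
    if (event.toolCallDelta) toolCallText += event.toolCallDelta;
  }
  return { answerText, toolCallText };
}
```

With a helper like this, each call site only decides what to do with the finished `LLMResponse`, which is roughly the consolidation this PR describes.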

@cla-assistant

cla-assistant bot commented Aug 20, 2025

CLA assistant check
All committers have signed the CLA.

@cla-assistant

cla-assistant bot commented Aug 20, 2025

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@github-actions

github-actions bot commented Aug 20, 2025

Preview mcp_server Image:

registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-pr:fatsgpt_mcp_server_c659dad2f7b21401b751511c6c0b6c2ce2c481b9

@github-actions

github-actions bot commented Aug 20, 2025

Preview sandbox Image:

registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-pr:fatsgpt_sandbox_c659dad2f7b21401b751511c6c0b6c2ce2c481b9

@github-actions

github-actions bot commented Aug 22, 2025

Preview fastgpt Image:

registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-pr:fatsgpt_c659dad2f7b21401b751511c6c0b6c2ce2c481b9

@francismiko francismiko marked this pull request as ready for review August 22, 2025 04:12
@gru-agent
Contributor

gru-agent bot commented Aug 22, 2025

TestGru Assignment

Summary

Link: Detail
CommitId: 5ab92ab
Status: 🚫 Skipped
Reason: No files need to be tested (the following paths do not match the include patterns):
- .husky/pre-commit
- packages/global/core/ai/request.ts
- packages/global/core/ai/type.d.ts
- packages/service/core/workflow/dispatch/ai/agent/promptCall.ts
- packages/service/core/workflow/dispatch/ai/agent/toolChoice.ts
- packages/service/core/workflow/dispatch/ai/chat.ts

History Assignment

Tip

You can @gru-agent and leave your feedback. TestGru will make adjustments based on your input

@c121914yu
Collaborator

@cursor review


Bug: Streamed Tool Calls Reset Across Chunks

The callingTool variable is declared inside the if (responseChoice?.tool_calls?.length) block, causing it to reset with each stream chunk. This prevents accumulating tool call data across chunks, leading to lost or incomplete tool calls when tool information spans multiple parts of the stream.

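The fix the reviewer is describing can be sketched as follows. This is an illustrative reduction, not FastGPT's actual code: the names (`ToolCallDelta`, `accumulateToolCalls`) are hypothetical, but the structure mirrors how OpenAI-style streams deliver `tool_calls` as indexed deltas that must be merged across chunks.

```typescript
// Illustrative sketch: the accumulator must be declared once, outside the
// per-chunk handling, so argument fragments from later chunks are appended
// to the call started in an earlier chunk. Names are hypothetical.

type ToolCallDelta = { index: number; id?: string; name?: string; args?: string };
type ToolCall = { id: string; name: string; args: string };

function accumulateToolCalls(chunks: ToolCallDelta[][]): ToolCall[] {
  const calls: ToolCall[] = []; // lives across all chunks (the reported bug re-declared this per chunk)
  for (const deltas of chunks) {
    for (const d of deltas) {
      const existing = calls[d.index];
      if (!existing) {
        // First delta for this index carries the id and function name.
        calls[d.index] = { id: d.id ?? '', name: d.name ?? '', args: d.args ?? '' };
      } else {
        // Subsequent deltas usually carry only argument fragments; append them.
        existing.args += d.args ?? '';
      }
    }
  }
  return calls;
}
```

If the accumulator is instead created inside the chunk handler, each chunk starts from an empty array and every partially streamed call is dropped, which is the lost-tool-call symptom described above.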

const write = res ? responseWriteController({ res, readStream: stream }) : undefined;

Bug: Incorrect Parameter Type Causes Streaming Failure

It looks like the responseWriteController is receiving a boolean stream flag for its readStream parameter. This parameter expects an actual stream object, so this might prevent streaming responses from working correctly.

Additional Locations (2)

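Why a boolean flag breaks this can be shown with a minimal sketch. The real `responseWriteController` signature lives in FastGPT's service package; the version below is an assumed stand-in that only illustrates the type mismatch: abort and backpressure handling need a real stream handle, which a `true`/`false` flag cannot provide.

```typescript
// Assumed sketch, not FastGPT's actual implementation: a controller that
// can only pause or destroy the upstream if it holds a real stream object.
import { Readable } from 'stream';

type WriteControllerOpts = { readStream?: Readable };

function responseWriteControllerSketch({ readStream }: WriteControllerOpts) {
  return {
    // Destroying the upstream only works with a genuine stream handle.
    abort() {
      if (readStream instanceof Readable) readStream.destroy();
    },
    // A boolean flag smuggled in as `readStream` fails this check.
    canAbort: readStream instanceof Readable
  };
}
```

Passing the boolean `stream` flag instead of the stream itself type-checks if the parameter is loosely typed, but leaves the controller unable to manage the connection, matching the failure mode flagged in this comment.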

@c121914yu c121914yu changed the base branch from main to v4.12.2-dev August 23, 2025 05:27
@c121914yu c121914yu merged commit 8b47446 into labring:v4.12.2-dev Aug 23, 2025
5 checks passed
c121914yu pushed a commit that referenced this pull request Aug 23, 2025
* feat: add LLM response processing functions, including the creation of stream-based and complete responses

* feat: add volta configuration for node and pnpm versions

* refactor: update LLM response handling and event structure in tool choice logic

* feat: update LLM response structure and integrate with tool choice logic

* refactor: clean up imports and remove unused streamResponse function in chat and toolChoice modules

* refactor: rename answer variable to answerBuffer for clarity in LLM response handling

* feat: enhance LLM response handling with tool options and integrate tools into chat and tool choice logic

* refactor: remove volta configuration from package.json

* refactor: reorganize LLM response types and ensure default values for token counts

* refactor: streamline LLM response handling by consolidating response structure and removing redundant checks

* refactor: enhance LLM response handling by consolidating tool options and streamlining event callbacks

* fix: build error

* refactor: update tool type definitions for consistency in tool handling
c121914yu pushed a commit that referenced this pull request Aug 25, 2025
c121914yu added a commit that referenced this pull request Aug 25, 2025
* feat: favorite apps & quick apps with their own configuration (#5515)

* chore: extract chat history and drawer; fix model selector

* feat: display favourite apps and make it configurable

* feat: favorite apps & quick apps with their own configuration

* fix: fix tab title and add loading state for searching

* fix: cascade delete favorite app and quick app while deleting relative app

* chore: make improvements

* fix: favourite apps ui

* fix: add permission for quick apps

* chore: fix permission & clear redundant code

* perf: chat home page code

* chatbox ui

* fix: 4.12.2-dev (#5520)

* fix: add empty placeholder; fix app quick status; fix tag and layout

* chore: add tab query for the setting tabs

* chore: use `useConfirm` hook instead of `MyModal`

* remove log

* fix: fix modal padding (#5521)

* perf: manage app

* feat: enhance model provider handling and update icon references (#5493)

* perf: model provider

* sdk package

* refactor: create llm response (#5499)

* feat: add LLM response processing functions, including the creation of stream-based and complete responses

* feat: add volta configuration for node and pnpm versions

* refactor: update LLM response handling and event structure in tool choice logic

* feat: update LLM response structure and integrate with tool choice logic

* refactor: clean up imports and remove unused streamResponse function in chat and toolChoice modules

* refactor: rename answer variable to answerBuffer for clarity in LLM response handling

* feat: enhance LLM response handling with tool options and integrate tools into chat and tool choice logic

* refactor: remove volta configuration from package.json

* refactor: reorganize LLM response types and ensure default values for token counts

* refactor: streamline LLM response handling by consolidating response structure and removing redundant checks

* refactor: enhance LLM response handling by consolidating tool options and streamlining event callbacks

* fix: build error

* refactor: update tool type definitions for consistency in tool handling

* feat: llm request function

* fix: ts

* fix: ts

* fix: ahook ts

* fix: variable name

* update lock

* ts version

* doc

* remove log

* fix: translation type

* perf: workflow status check

* fix: ts

* fix: prompt tool call

* fix: fix missing plugin interact window & make tag draggable (#5527)

* fix: incorrect select quick apps state; filter apps type (#5528)

* fix: usesafe translation

* perf: add quickapp modal

---------

Co-authored-by: 伍闲犬 <[email protected]>
Co-authored-by: Ctrlz <[email protected]>
Co-authored-by: francis <[email protected]>
@francismiko francismiko deleted the francis/invoke-llm branch August 26, 2025 12:01
