feature: V4.12.2 #5525
Conversation
There is too much information in the pull request to test.

Preview mcp_server Image:
Preview sandbox Image:
Docs Preview: 🚀 FastGPT Document Preview Ready!
Preview fastgpt Image:

@cursor review
  });
  // Dynamic input is stored in the dynamic key
  if (input.canEdit && dynamicInput && params[dynamicInput.key]) {
    params[dynamicInput.key][input.key] = valueTypeFormat(value, input.valueType);
Check warning, Code scanning / CodeQL: Prototype-polluting assignment (Medium), triggered by user-controlled input.
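CodeQL flags the bracket assignment because both property keys can come from user input, so a crafted key such as `__proto__` could write onto `Object.prototype`. Below is a minimal sketch of one possible guard; the types, the `isSafeKey` helper, and the local `valueTypeFormat` stand-in are assumptions for illustration, not the fix the maintainers actually applied.

```ts
// Hypothetical minimal types mirroring the shapes visible in the diff.
type ValueType = 'string' | 'number' | 'boolean' | 'object';

interface NodeInput {
  key: string;
  canEdit?: boolean;
  valueType?: ValueType;
}

// Property names that can reach Object.prototype when used as keys.
const UNSAFE_KEYS = new Set(['__proto__', 'constructor', 'prototype']);
const isSafeKey = (key: string): boolean => !UNSAFE_KEYS.has(key);

// Stand-in for FastGPT's valueTypeFormat; the real helper lives in the repo.
const valueTypeFormat = (value: unknown, _valueType?: ValueType) => value;

function assignDynamicInput(
  params: Record<string, Record<string, unknown>>,
  dynamicInput: NodeInput,
  input: NodeInput,
  value: unknown
) {
  // Dynamic input is stored in the dynamic key, as in the original diff,
  // but only when neither key can pollute the prototype chain.
  if (input.canEdit && params[dynamicInput.key]) {
    if (isSafeKey(dynamicInput.key) && isSafeKey(input.key)) {
      params[dynamicInput.key][input.key] = valueTypeFormat(value, input.valueType);
    }
  }
}
```

An alternative with the same effect is to build the container with `Object.create(null)` so it has no prototype to pollute; either approach addresses the class of issue CodeQL is warning about.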
* chore: extract chat history and drawer; fix model selector
* feat: display favourite apps and make it configurable
* feat: favorite apps & quick apps with their own configuration
* fix: fix tab title and add loading state for searching
* fix: cascade delete favorite app and quick app while deleting relative app
* chore: make improvements
* fix: favourite apps ui
* fix: add permission for quick apps
* chore: fix permission & clear redundant code
* fix: add empty placeholder; fix app quick status; fix tag and layout
* chore: add tab query for the setting tabs
* chore: use `useConfirm` hook instead of `MyModal`
* feat: add LLM response processing functions, including the creation of stream-based and complete responses
* feat: add volta configuration for node and pnpm versions
* refactor: update LLM response handling and event structure in tool choice logic
* feat: update LLM response structure and integrate with tool choice logic
* refactor: clean up imports and remove unused streamResponse function in chat and toolChoice modules
* refactor: rename answer variable to answerBuffer for clarity in LLM response handling
* feat: enhance LLM response handling with tool options and integrate tools into chat and tool choice logic
* refactor: remove volta configuration from package.json
* refactor: reorganize LLM response types and ensure default values for token counts
* refactor: streamline LLM response handling by consolidating response structure and removing redundant checks
* refactor: enhance LLM response handling by consolidating tool options and streamlining event callbacks
* fix: build error
* refactor: update tool type definitions for consistency in tool handling
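Several of the commits above consolidate stream-based and complete LLM responses behind one structure and rename the accumulator to `answerBuffer`. The sketch below illustrates that general idea only; the names `LLMResponse`, `createCompleteResponse`, and `createStreamResponse` are invented for this example and are not FastGPT's actual APIs.

```ts
// Hypothetical consolidated response shape; token counts default to 0 when absent.
interface LLMResponse {
  answerText: string;
  usage: { inputTokens: number; outputTokens: number };
}

// Complete (non-streaming) path: the whole answer arrives at once.
function createCompleteResponse(
  answerText: string,
  usage?: Partial<LLMResponse['usage']>
): LLMResponse {
  return {
    answerText,
    usage: {
      inputTokens: usage?.inputTokens ?? 0,
      outputTokens: usage?.outputTokens ?? 0
    }
  };
}

// Streaming path: accumulate deltas into an answer buffer, forward each
// delta through an event callback, then return the same consolidated shape.
async function createStreamResponse(
  chunks: AsyncIterable<string>,
  onEvent?: (delta: string) => void
): Promise<LLMResponse> {
  let answerBuffer = '';
  for await (const delta of chunks) {
    answerBuffer += delta;
    onEvent?.(delta);
  }
  return createCompleteResponse(answerBuffer);
}
```

Having both paths return one shape is what lets downstream callers (chat and tool-choice logic, per the commit messages) stay agnostic about whether the model streamed its output.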
No description provided.