Support the new workflow for @llamaindex/server #592
🦋 Changeset detected. Latest commit: 352d40b. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Walkthrough
This update migrates the server workflow engine from "@llama-flow/core" to "@llamaindex/workflow", refactors event and workflow handling to align with the new engine, and updates related type definitions, stream processing utilities, and dependencies. It also improves ignore patterns for Python artifacts and virtual environments in development tooling configuration files.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Server
    participant WorkflowEngine as "@llamaindex/workflow"
    participant DataStream
    User->>Server: Sends chat message(s)
    Server->>Server: Build chatHistory, validate input
    Server->>WorkflowEngine: Create workflow via workflowFactory
    Server->>WorkflowEngine: runWorkflow(workflow, input, abortSignal)
    WorkflowEngine-->>Server: Emits event stream
    Server->>DataStream: toDataStream(eventStream, callbacks)
    DataStream-->>User: Streams assistant responses and annotations
    User-->>Server: (Disconnect triggers abort)
```
📜 Recent review details
Configuration used: .coderabbit.yaml
⛔ Files ignored due to path filters (1)
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
⏰ Context from checks skipped due to timeout of 90000ms (27)
…nt. Clean up unused imports and improve error handling for missing user input.
…ling for missing user input. Replace workflowInputEvent with startAgentEvent and enhance chat history management. Add callbacks for suggested questions event.
…duce new `code_workflow.ts` for managing code generation and updates, and create a factory function in `workflow.ts`. Modify TypeScript helper to copy all `.ts` files instead of just `workflow.ts`. Update chat handler to utilize `AgentInputData` for improved data handling.
…on, replacing the previous context creation method. This change simplifies the workflow stream initialization and enhances code clarity.
…ce event handling in code_workflow.ts and workflow.ts. Update grammar in enhancedPrompt for clarity and improve response handling in agentStreamEvent.
…mprove memory management. Replace custom event classes with workflowEvent for better clarity and maintainability. Update workflow definition to utilize getWorkflow function, enhancing code organization and readability.
…ents - Introduced `doc-workflow.ts` for managing document generation and updates. - Created `code-workflow.ts` for code artifact management. - Enhanced event handling with `workflowEvent` for better clarity and maintainability. - Updated `README-template.md` to include setup instructions and use cases for new workflows. - Modified `workflow.ts` to allow switching between code and document workflows. - Improved grammar and clarity in prompts and comments throughout the code.
…or improved event handling. Introduce workflowToEngineResponseStream function to streamline the processing of workflow events and enhance error handling. Update return statement in runWorkflow to utilize the new stream implementation.
Actionable comments posted: 13
♻️ Duplicate comments (3)
.changeset/smart-clocks-guess.md (1)
1-6: Duplicate: Reminder to bump version
This changeset records a `minor` bump for `@llamaindex/server`. Ensure the version in `packages/server/package.json` is updated accordingly as per your release process.
.changeset/yummy-moles-lick.md (1)
1-6: Duplicate: Consolidate patch changesets
Refer to the consolidation suggestion in `fluffy-otters-refuse.md` regarding merging patch-level changesets for `create-llama`.
packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (1)
167-171: Same brittle JSON extraction logic as in doc-workflow
Reuse the improved pattern and error handling suggested earlier to reduce duplicated pitfalls.
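To illustrate the kind of tolerant extraction this comment asks for, here is a hypothetical standalone helper (the name `extractJson` and its exact behavior are illustrative, not the PR's actual code) that pulls a JSON object out of an LLM response while tolerating surrounding prose:

```typescript
// Hypothetical helper (not the PR's code): extract a JSON object from an
// LLM response that may wrap it in prose or code fences.
function extractJson(text: string): unknown {
  const start = text.indexOf("{");
  const end = text.lastIndexOf("}");
  if (start === -1 || end < start) {
    // Fail loudly instead of passing garbage downstream.
    throw new Error("No JSON object found in LLM response");
  }
  return JSON.parse(text.slice(start, end + 1));
}

const reply = 'Sure! Here is the plan:\n{"requirement": {"language": "ts"}}\nDone.';
console.log(extractJson(reply));
```

Centralizing this in one helper (rather than copy-pasting the regex in both workflows) is what avoids the duplicated pitfalls the comment mentions.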
🧹 Nitpick comments (9)
.prettierignore (1)
10-15: Align Prettier ignore patterns for Python caches
As with ESLint, the patterns `**/*.mypy_cache/**`, `**/*.venv/**`, and `**/*.ruff_cache/**` may not catch the dot-prefixed cache directories (e.g., `.mypy_cache`). Update to:

```diff
- **/*.mypy_cache/**
- **/*.venv/**
- **/*.ruff_cache/**
+ **/.mypy_cache/**
+ **/.venv/**
+ **/.ruff_cache/**
```

and ensure these entries mirror the ESLint ignore configuration for consistency.
.changeset/fluffy-otters-refuse.md (1)
1-6: Consolidate patch changesets for create-llama
There are multiple `patch` changesets for `create-llama` in this PR (`fluffy-otters-refuse.md` and `yummy-moles-lick.md`). Consider merging them into a single changeset to simplify version history and changelog generation.
packages/create-llama/templates/components/workflows/typescript/artifacts/workflow.ts (1)
1-9: Simple and effective workflow factory implementation.
The workflow factory provides a clean way to switch between different artifact workflows (document and code).
Some suggestions for improvement:
- Consider using a more specific type for `reqBody` instead of `any` to improve type safety
- Standardizing the file naming format across imports would be beneficial (notice "code_workflow" vs "doc-workflow")

```diff
- export const workflowFactory = async (reqBody: any) => {
+ export const workflowFactory = async (reqBody: Record<string, unknown>) => {
```

packages/create-llama/templates/components/workflows/typescript/artifacts/README-template.md (2)
7-9: Specify languages in code blocks for better syntax highlighting.
Fenced code blocks should include a language identifier for proper syntax highlighting: the bare ``` fences around the `npm install`, `npm run dev`, and `npm run generate:ui` commands should be ```bash fences instead.
Also applies to: 13-15, 27-29

🧰 Tools
🪛 markdownlint-cli2 (0.17.2)
7-7: Fenced code blocks should have a language specified (MD040, fenced-code-language)

40-46: Fix grammar and provide clearer example.
There are a few grammatical issues in this section.

```diff
- You can start by sending an request on the [chat UI](http://localhost:3000) or you can test the `/api/chat` endpoint with the following curl request:
+ You can start by sending a request on the [chat UI](http://localhost:3000) or you can test the `/api/chat` endpoint with the following curl request:
```

🧰 Tools
🪛 LanguageTool
[misspelling] ~40-~40: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...flow is used. You can start by sending an request on the [chat UI](http://localho...(EN_A_VS_AN)
packages/create-llama/templates/components/workflows/typescript/artifacts/doc-workflow.ts (1)
246-252: `docMatch` regex misses the newline after the language tag
```` /```(markdown|html)([\s\S]*)```/ ```` consumes the language token but not the required end-of-line, so `docMatch[2]` starts with `\n`, and more importantly it will not match if the LLM forgets the language hint.

````diff
-const docMatch = response.text.match(/```(markdown|html)([\s\S]*)```/);
+const docMatch = response.text.match(/```(?:markdown|html)\s*([\s\S]*?)```/i);
````

Using `?:` (non-capturing) plus `\s*` makes the pattern more tolerant and avoids greedily capturing through to the last fence.
packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (1)
272-295: Mismatch between detected language and emitted artifact
If the caller leaves `language` empty, we still know the language from the regex (`codeMatch[1]`). Use that as a fallback:

```diff
- language: planData.requirement.language || "",
+ language:
+   planData.requirement.language ??
+   (codeMatch ? codeMatch[1].toLowerCase() : ""),
```

Prevents blank language fields that downstream UIs may not handle.
packages/create-llama/templates/components/workflows/typescript/financial_report/workflow.ts (1)
101-103: Allocate the tools array once to avoid unnecessary re-creation
`tools` is rebuilt every time `inputEvent` fires, although its composition never changes. Creating it once when the workflow is instantiated reduces noise and a (tiny) amount of GC pressure.

```diff
- workflow.handle([inputEvent], async ({ data }) => {
-   const { sendEvent, state } = getContext();
-   const chatHistory = data.input;
-   const tools = [codeInterpreterTool, documentGeneratorTool, queryEngineTool];
+ const tools = [codeInterpreterTool, documentGeneratorTool, queryEngineTool];
+
+ workflow.handle([inputEvent], async ({ data }) => {
+   const { sendEvent, state } = getContext();
+   const chatHistory = data.input;
```

packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
326-338: Do not pass an already-consumed stream as `raw`
`raw: stream` is forwarded after the `for await` loop has exhausted the iterator, so downstream consumers will receive an unusable, empty stream.
Consider omitting `raw`, or cloning the chunks while iterating so the raw data is still available.
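One way to implement the "clone the chunks while iterating" option is a small helper that buffers each chunk as it is consumed, so a copy survives the loop. This is a hedged sketch over plain async iterables, independent of the workflow library's actual stream types:

```typescript
// Sketch: consume an async iterable once, invoking a per-chunk callback,
// while also collecting the chunks so the "raw" data remains available
// after the iterator is exhausted.
async function collectWhileIterating<T>(
  source: AsyncIterable<T>,
  onChunk: (chunk: T) => void,
): Promise<T[]> {
  const collected: T[] = [];
  for await (const chunk of source) {
    collected.push(chunk);
    onChunk(chunk);
  }
  return collected;
}
```

The returned array can then be forwarded as `raw` (or re-wrapped as a fresh stream) instead of the spent iterator.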
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (18)
- .changeset/fluffy-otters-refuse.md (1 hunks)
- .changeset/smart-clocks-guess.md (1 hunks)
- .changeset/yummy-moles-lick.md (1 hunks)
- .prettierignore (1 hunks)
- eslint.config.mjs (1 hunks)
- packages/create-llama/helpers/typescript.ts (1 hunks)
- packages/create-llama/questions/simple.ts (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/artifacts/README-template.md (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/artifacts/doc-workflow.ts (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/artifacts/workflow.ts (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/deep_research/README-template.md (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (5 hunks)
- packages/create-llama/templates/components/workflows/typescript/financial_report/workflow.ts (7 hunks)
- packages/server/src/events.ts (6 hunks)
- packages/server/src/handlers/chat.ts (2 hunks)
- packages/server/src/types.ts (1 hunks)
- packages/server/src/utils/workflow.ts (4 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (5)
packages/create-llama/helpers/typescript.ts (1)
packages/create-llama/helpers/copy.ts (1)
copy (13-49)
packages/create-llama/templates/components/workflows/typescript/artifacts/workflow.ts (1)
packages/create-llama/templates/components/workflows/typescript/artifacts/doc-workflow.ts (1)
createDocumentArtifactWorkflow (60-334)
packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (1)
packages/server/src/events.ts (2)
artifactEvent (120-123)
extractLastArtifact (188-215)
packages/server/src/utils/workflow.ts (2)
packages/server/src/types.ts (1)
ServerWorkflow (7-7)
packages/server/src/events.ts (1)
sourceEvent(28-28)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
packages/server/src/events.ts (1)
toSourceEvent (59-67)
🪛 LanguageTool
packages/create-llama/templates/components/workflows/typescript/deep_research/README-template.md
[style] ~34-~34: Consider an alternative adjective to strengthen your wording.
Context: ...ings.ts). ## Custom UI Components For Deep Research, we have a custom component lo...
(DEEP_PROFOUND)
packages/create-llama/templates/components/workflows/typescript/artifacts/README-template.md
[uncategorized] ~35-~35: Possible missing comma found.
Context: ... To generate code and display it in the UI like Vercel's v0. - [Document Workflow]...
(AI_HYDRA_LEO_MISSING_COMMA)
[typographical] ~38-~38: It appears that a comma is missing.
Context: ...h artifact workflow to use. Without any changes the Code Workflow is used. You can sta...
(DURING_THAT_TIME_COMMA)
[misspelling] ~40-~40: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...flow is used. You can start by sending an request on the [chat UI](http://localho...
(EN_A_VS_AN)
🪛 markdownlint-cli2 (0.17.2)
packages/create-llama/templates/components/workflows/typescript/artifacts/README-template.md
7-7: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
13-13: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
27-27: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
🪛 Biome (1.9.4)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts
[error] 114-114: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
[error] 116-116: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
🔇 Additional comments (15)
.changeset/fluffy-otters-refuse.md (1)
1-6: Approve new changeset for create-llama
This changeset correctly marks a `patch` update for `create-llama`, documenting the addition of an artifact use case for the TypeScript template. No code or public APIs are modified.
.changeset/yummy-moles-lick.md (1)
1-6: Approve new changeset for create-llama
This changeset properly documents a patch update for `create-llama`, reflecting the updated TypeScript use cases using the new workflow engine.
packages/create-llama/templates/components/workflows/typescript/deep_research/README-template.md (1)
34-34: Path alignment for UI components looks good.
This change updates the file path from `components/deep_research_event.jsx` to `components/ui_event.jsx`, which aligns with the broader refactoring of workflows from class-based to functional, middleware-driven architectures in this PR.
🧰 Tools
🪛 LanguageTool
[style] ~34-~34: Consider an alternative adjective to strengthen your wording.
Context: ...ings.ts). ## Custom UI Components For Deep Research, we have a custom component lo...(DEEP_PROFOUND)
packages/create-llama/questions/simple.ts (1)
65-77: Good addition to support TypeScript artifact workflows.
Removing the conditional check that previously restricted language selection for artifacts enables users to create TypeScript-based artifact workflows. This change is consistent with the introduction of new TypeScript artifact workflows in this PR.
packages/create-llama/helpers/typescript.ts (1)
34-43: Essential update to support multiple TypeScript workflow files.
This change updates the copy pattern from just `workflow.ts` to all TypeScript files (`*.ts`) in the workflow template directory. This is necessary to support the new TypeScript artifact workflows, which include multiple files like `code-workflow.ts`, `doc-workflow.ts`, and other supporting files.
packages/server/src/types.ts (2)
1-1: Simplified import statement.
Removing unused types (`AgentInputData` and `AgentWorkflowContext`) from the import makes the code cleaner and more maintainable.
7-7: ✅ Verification successful
Streamlined ServerWorkflow type definition.
Simplifying the type definition to a straightforward union of `Workflow | AgentWorkflow` without generic parameters aligns with the refactored workflow system using event factories instead of classes.
Since this simplifies the type, please verify that all code using `ServerWorkflow` works correctly with this change, especially in places where specific generic parameters might have been expected.

🏁 Script executed:

```bash
#!/bin/bash
# Check for usage of ServerWorkflow to ensure compatibility with the simplified type
rg -A 3 "ServerWorkflow" --type ts
```

Length of output: 2056

All ServerWorkflow usages verified—no compatibility issues found
Verified that simplifying `ServerWorkflow` to `Workflow | AgentWorkflow` does not break any existing usage:
- packages/server/src/types.ts: the type declaration remains correct.
- packages/server/src/utils/workflow.ts: imported and passed through factory functions without generic assumptions.
- packages/server/src/server.ts: referenced in `LlamaIndexServerOptions` without expecting type parameters.

No further changes required.
packages/server/src/handlers/chat.ts (3)
3-4: Improved type imports.
Good update to import specialized types from LlamaIndex for better type safety.
27-33: Well-structured input creation with proper typing.
The refactoring to create a structured `workflowInput` of type `AgentInputData` improves code clarity and type safety.
35-37: Clear workflow creation and execution.
The workflow instantiation and execution is now more explicit, with clear separation between creating the workflow and running it with the input.
packages/server/src/events.ts (5)
5-5: Good refactoring to factory-based event creation.
Adding the `workflowEvent` import and using it to create the source event constant improves consistency with the new architecture.
Also applies to: 28-28
30-39: Well-structured agent run event data.
The agent run event data structure is clear and the factory-created event constant is consistent with the new architecture.
63-66: Clean conversion to factory-based event creation.
The `toSourceEvent` function has been updated to use the factory pattern, which is consistent with the overall refactoring.
76-89: Consistent agent run event creation.
The `toAgentRunEvent` function now uses the factory pattern consistently with the other event creation functions.
120-123: Well-structured artifact event definition.
The artifact event is now defined using the factory pattern, which is consistent with the other events in the file.
…eamline event processing.
Actionable comments posted: 0
♻️ Duplicate comments (1)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
114-116: Replace `{}` with `void` for empty payload types
The empty object type `{}` should be avoided as it matches any non-null value. Use `void` to better indicate events with no payload data.

```diff
-const planResearchEvent = workflowEvent<{}>();
-const reportEvent = workflowEvent<{}>();
+const planResearchEvent = workflowEvent<void>();
+const reportEvent = workflowEvent<void>();
```

🧰 Tools
🪛 Biome (1.9.4)
[error] 114-114: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
[error] 116-116: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
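A quick standalone illustration of why Biome bans `{}` as a type (a sketch, unrelated to the workflow library's actual typings): `{}` is satisfied by any non-nullable value, so a payload typed `{}` provides no real shape, while a `void` payload only admits `undefined`, which is what an empty-payload event actually carries at runtime.

```typescript
// '{}' means "any non-nullable value": all of these type-check.
const n: {} = 42;
const s: {} = "not an object";
const f: {} = () => {};

// A 'void' payload only accepts undefined, rejecting accidental data.
type EmptyPayloadEvent = { payload: void };
const evt: EmptyPayloadEvent = { payload: undefined };

console.log(typeof n, typeof s, typeof f, evt.payload);
```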
🧹 Nitpick comments (1)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
161-192: Effective implementation of the start agent event handlerThe handler properly initializes the workflow state, performs document retrieval, emits UI events, and sends source events using
toSourceEvent. This follows good practices for asynchronous workflow management.- state.memory.set(chatHistory); + await state.memory.set(chatHistory);Consider making the memory set operation awaited for consistency with other async operations, since ChatMemoryBuffer's set method might be asynchronous.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (5 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
packages/server/src/events.ts (1)
toSourceEvent (59-67)
🪛 Biome (1.9.4)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts
[error] 114-114: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
[error] 116-116: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
⏰ Context from checks skipped due to timeout of 90000ms (27)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files)
🔇 Additional comments (7)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (7)
147-159: Good use of stateful middleware pattern
The stateful middleware pattern with `createStatefulMiddleware` is a clean approach that separates state management from workflow logic. The state initialization is well structured with all necessary workflow state properties.
181-181: Good integration with source event system
The use of `toSourceEvent` properly converts retrieved nodes into a standardized event format, which aligns well with the server's event system.
234-254: Watch for potential double counting of questions
The current implementation increments `totalQuestions` before research starts, but doesn't increment it again after completion. This avoids the double-counting issue mentioned in previous reviews.
194-267: Elegant handling of plan research logic
The plan research handler effectively:
- Notifies the UI of analysis progress
- Creates a research plan based on context and memory
- Handles the different decision paths (cancel, research, report)
- Properly uses the stream.until pattern to wait for research results
269-308: Clean asynchronous handling of research questions
The research event handler properly:
- Updates UI state for the specific question being researched
- Calls the answer question function with the necessary context
- Updates memory and research results with the answer
- Sends a UI event with the complete answer data
310-349: Effective streaming implementation for report generation
The report event handler correctly:
- Builds appropriate messages for the LLM using memory and the report prompt
- Streams the response in chunks, updating the UI with each chunk
- Properly finalizes the workflow with the stop agent event
354-420: Well-structured helper functions at the end of the file
The helper functions are properly:
- Isolated from the main workflow logic
- Organized at the end of the file as suggested in previous comments
- Implemented with clear input/output contracts and minimal side effects
- Making appropriate use of the LLM and prompt templates
…usage and error handling. Update toDataStreamResponse function to toDataStream and enhance callback functionality for better stream management in workflow.ts.
8d3053d to df62ed0
Actionable comments posted: 4
🧹 Nitpick comments (4)
packages/server/src/utils/stream.ts (2)
45-53: Unbounded `fullContent` accumulation can exhaust memory on long streams
`fullContent += content` keeps a full copy of everything streamed. For large agent responses this may blow up the heap, especially on serverless platforms.
If the sole purpose is to build the last assistant message for suggested-question generation, consider:
- Truncating to a reasonable max length, or
- Collecting only the final chunk if the agent guarantees the full answer is sent in one `stopAgentEvent`, or
- Streaming the content into a temporary buffer and releasing it right after `onCompletion`.
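The first option (truncating to a maximum length) could look like the following sketch. This is a hypothetical helper, not the PR's code; it keeps only the trailing `maxLen` characters so the tail of the answer stays available for suggested-question generation without retaining the whole transcript:

```typescript
// Hypothetical bounded accumulator: retains at most maxLen trailing
// characters of the streamed content instead of the full transcript.
function makeBoundedAccumulator(maxLen: number) {
  let buffer = "";
  return {
    push(chunk: string) {
      buffer += chunk;
      if (buffer.length > maxLen) {
        // Keep only the tail; the head is no longer needed.
        buffer = buffer.slice(buffer.length - maxLen);
      }
    },
    value: () => buffer,
  };
}
```

Memory use is then bounded by `maxLen` regardless of how long the stream runs.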
58-63: Error path skips user callback when `onError` itself throws
If `userCallbacks?.onError` throws, the subsequent `dataStream.writeData(errorMessage)` is skipped and the promise rejects, surfacing as "Unhandled error in stream".
Wrap the user callback in a try/catch to guarantee we still emit an error chunk.
packages/server/src/utils/workflow.ts (2)
60-63: `until(stopAgentEvent)` may swallow the terminating event
`until` drops the element that satisfies the predicate. If callers expect to receive the final `stopAgentEvent` (e.g. to show typing-indicator → stopped), it will never reach them. Consider a `takeUntilInclusive` helper, or enqueue the event manually after the transformation.
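A `takeUntilInclusive` along the lines this comment suggests might look like the following hedged sketch over plain async iterables (the actual `@llamaindex/workflow` stream API may differ):

```typescript
// Sketch: yield events up to AND including the first one matching the
// predicate, so consumers still observe the terminating event.
async function* takeUntilInclusive<T>(
  source: AsyncIterable<T>,
  predicate: (value: T) => boolean,
): AsyncGenerator<T> {
  for await (const value of source) {
    yield value;
    if (predicate(value)) {
      return; // stop only after emitting the matching event
    }
  }
}
```

The difference from an exclusive `until` is just the order of `yield` and the predicate check.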
92-95: `downloadLlamaCloudFilesFromNodes` fire-and-forget loses error visibility
Failures inside `downloadFile` are swallowed, making debugging difficult.
Either:
- `await downloadLlamaCloudFilesFromNodes(...)` so the transform's `catch` handles errors, or
- Wrap the call in its own `try/catch` and enqueue an error annotation.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- packages/server/src/utils/request.ts (1 hunks)
- packages/server/src/utils/stream.ts (1 hunks)
- packages/server/src/utils/workflow.ts (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/server/src/utils/workflow.ts (4)
packages/server/src/types.ts (1)
ServerWorkflow (7-7)
packages/server/src/utils/stream.ts (1)
toDataStream (32-72)
packages/server/src/utils/suggestion.ts (1)
sendSuggestedQuestionsEvent (18-29)
packages/server/src/events.ts (3)
toAgentRunEvent (65-86)
toSourceEvent (55-63)
sourceEvent (24-24)
🔇 Additional comments (1)
packages/server/src/utils/workflow.ts (1)
35-39:❓ Verification inconclusive
chatHistoryfield may be silently dropped bystartAgentEventPrevious discussions (see earlier PR comments) noted that
startAgentEvent’s schema only accepts{ userInput }. Passing an extrachatHistoryproperty compiles but is discarded at runtime, so downstream tools never see the history.Double-check the latest
@llamaindex/workflowtypings; ifchatHistoryis still unsupported, remove it and transmit history via a dedicated custom event or extend the event upstream.
🏁 Script executed:
#!/bin/bash set -e # Find all occurrences of startAgentEvent and its imports rg "startAgentEvent" -n . rg "import.*startAgentEvent" -n .Length of output: 1310
🏁 Script executed:
#!/bin/bash set -e echo "Attempting to fetch type definitions for @llamaindex/workflow from unpkg..." # List of potential type definition file paths in the package files=( "index.d.ts" "dist/index.d.ts" "dist/esm/index.d.ts" "esm/index.d.ts" "types/index.d.ts" ) for file in "${files[@]}"; do url="https://unpkg.com/@llamaindex/workflow@latest/$file" status=$(curl -s -o /dev/null -w "%{http_code}" "$url") echo "Checking $file: HTTP $status" if [ "$status" -eq 200 ]; then echo "=== Contents of $file (first 200 lines) ===" curl -s "$url" | sed -n '1,200p' exit 0 fi done echo "No type definition file found in known locations." exit 1Length of output: 1952
🏁 Script executed:
#!/bin/bash set -e PKG="@llamaindex/workflow" echo "Fetching package metadata for $PKG..." meta=$(curl -s "https://registry.npmjs.org/${PKG}") latest=$(echo "$meta" | jq -r '."dist-tags".latest') echo "Latest version: $latest" tarball=$(echo "$meta" | jq -r ".versions[\"$latest\"].dist.tarball") echo "Tarball URL: $tarball" echo "Downloading and extracting tarball..." mkdir -p /tmp/pkg curl -sL "$tarball" | tar -xz -C /tmp/pkg echo "Listing TypeScript declaration files:" find /tmp/pkg/package -type f -name "*.d.ts" echo "Searching for startAgentEvent definitions:" grep -R "startAgentEvent" -n /tmp/pkg/packageLength of output: 117769
🏁 Script executed:
#!/bin/bash set -e # Fetch tarball URL for version 1.1.0 meta=$(curl -s https://registry.npmjs.org/@llamaindex/workflow) tarball=$(echo "$meta" | jq -r '.versions["1.1.0"].dist.tarball') # Download and extract mkdir -p /tmp/pkg curl -sL "$tarball" | tar -xz -C /tmp/pkg # List declaration files echo "=== .d.ts files in package ===" find /tmp/pkg/package -type f -name "*.d.ts" # Search for startAgentEvent definition echo "=== Occurrences of startAgentEvent ===" grep -R -n "startAgentEvent" /tmp/pkg/package || echo "No occurrences found"Length of output: 117637
Verify chatHistory support in startAgentEvent API
We attempted to locate a `startAgentEvent` definition in the `@llamaindex/workflow@1.1.0` package (including its TypeScript declarations) but found no mention of a `chatHistory` field. If `startAgentEvent` still only accepts `{ userInput }` at runtime, the extra `chatHistory` property will be discarded.

Please manually confirm:

- The shape of `startAgentEvent` in `@llamaindex/workflow@1.1.0` (types and runtime schema).
- Whether the `chatHistory` property is dropped at runtime.

If it is unsupported:

- Remove `chatHistory` from the `.with()` call in `packages/server/src/utils/workflow.ts`.
- Transmit the history via a dedicated custom event or extend the workflow event schema upstream.
… usage and improve error handling. Update toDataStream function in stream.ts for better data streaming and processing. Enhance imports from @llamaindex/workflow for consistency.
Actionable comments posted: 0
♻️ Duplicate comments (1)
packages/server/src/utils/stream.ts (1)
25-47: ⚠️ Potential issue

Stream never explicitly closes – downstream consumers may wait indefinitely

The `toDataStream` function doesn't explicitly close the stream after processing completes. This could lead to hanging HTTP responses if consumers wait for a closure signal.

Add an explicit close after processing the stream:

```diff
 execute: async (dataStreamWriter: DataStreamWriter) => {
   for await (const event of stream) {
     if (agentStreamEvent.include(event) && event.data.delta) {
       const content = event.data.delta;
       if (content) {
         dataStreamWriter.write(formatDataStreamPart("text", content));
       }
     } else {
       dataStreamWriter.writeMessageAnnotation(event.data as JSONValue);
     }
   }
+  // Explicitly close the stream to prevent hanging responses
+  dataStreamWriter.close();
 },
```
🧹 Nitpick comments (4)
packages/server/src/utils/stream.ts (2)
9-18: Unused interface in current implementation

The `StreamCallbacks` interface is defined but not used in the `toDataStream` function below. Consider either:

- Implementing these callbacks in `toDataStream` to provide better control for consumers, or
- Removing the unused interface if it's not needed for future implementations
36-38: Handle empty arrays or non-object annotations

The code assumes `event.data` is always a valid JSON value when writing message annotations. Consider adding validation to prevent potential runtime errors.

```diff
 } else {
-  dataStreamWriter.writeMessageAnnotation(event.data as JSONValue);
+  // Ensure event.data is a valid non-null object before writing annotation
+  if (event.data && typeof event.data === 'object') {
+    dataStreamWriter.writeMessageAnnotation(event.data as JSONValue);
+  }
 }
```

packages/server/src/handlers/chat.ts (2)
40-48: Set content type header for streaming response

When streaming responses, it's important to set the appropriate content type header. While `pipeStreamToResponse` handles the mechanics of streaming, it doesn't set headers.

```diff
 const workflow = await workflowFactory(body);
 const workflowEventStream = await runWorkflow(
   workflow,
   workflowInput,
   abortController.signal,
 );
+// Set appropriate content type for streaming response
+res.setHeader('Content-Type', 'text/event-stream');
+res.setHeader('Cache-Control', 'no-cache');
+res.setHeader('Connection', 'keep-alive');
 const dataStream = toDataStream(workflowEventStream);
 pipeStreamToResponse(res, dataStream);
```
47-48: Handle potential errors from stream conversion and piping

The code doesn't handle potential errors that might occur during stream conversion or piping. Consider adding error handling for these operations.

```diff
-const dataStream = toDataStream(workflowEventStream);
-pipeStreamToResponse(res, dataStream);
+try {
+  const dataStream = toDataStream(workflowEventStream);
+  await pipeStreamToResponse(res, dataStream);
+} catch (streamError) {
+  console.error("Streaming error:", streamError);
+  // Only send error response if headers haven't been sent yet
+  if (!res.headersSent) {
+    return sendJSONResponse(res, 500, {
+      detail: (streamError as Error).message || "Error streaming response",
+    });
+  }
+}
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- packages/server/src/handlers/chat.ts (2 hunks)
- packages/server/src/server.ts (2 hunks)
- packages/server/src/types.ts (1 hunks)
- packages/server/src/utils/stream.ts (1 hunks)
- packages/server/src/utils/workflow.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/server/src/server.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/server/src/types.ts
- packages/server/src/utils/workflow.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/server/src/handlers/chat.ts (4)
packages/create-llama/templates/components/workflows/typescript/artifacts/workflow.ts (1)
- `workflowFactory` (4-9)

packages/server/src/utils/workflow.ts (1)

- `runWorkflow` (25-44)

packages/server/src/utils/stream.ts (1)

- `toDataStream` (25-47)

packages/server/src/utils/request.ts (1)

- `pipeStreamToResponse` (30-41)
🔇 Additional comments (3)
packages/server/src/handlers/chat.ts (3)
29-35: Properly structured workflow input

The refactoring to use a defined `AgentInputData` interface improves code clarity and type safety. The mapping of messages to the required format is clean and efficient.

37-39: Excellent addition of abort handling

Adding abort controller support ensures resources are properly released when the client disconnects, preventing server-side process hanging.

50-50: Improved error logging

Adding the "Chat handler error:" prefix to error logs makes it easier to identify the source of errors in logs.
…factor toDataStream in stream.ts to support callback options for onStart, onText, and onFinal events. Export generateNextQuestions function in suggestion.ts for improved accessibility.
…ates to enhance consistency and organization. Update package.json to include @llamaindex/workflow version 1.1.0. Remove commented-out code in gen-ui.ts for cleaner implementation.
Actionable comments posted: 0
♻️ Duplicate comments (8)
packages/server/src/utils/stream.ts (2)
35-40: Consider using a more specific type for the stream parameter

The use of `AsyncIterable<WorkflowEventData<unknown>>` is correct but could possibly be defined more specifically if the exact event data types are known.
69-72: 🛠️ Refactor suggestion

Ensure stream is explicitly closed after onFinal callback

The stream may not be properly closed if the `onFinal` callback itself writes to the stream, which could cause hanging HTTP responses. Consider adding an explicit close call after the callback completes.

```diff
 // Call onFinal with the complete text when stream ends
 if (callbacks?.onFinal) {
   await callbacks.onFinal(completionText, dataStreamWriter);
+  // Ensure stream is properly closed after all callbacks complete
+  dataStreamWriter.close();
 }
```

packages/server/src/utils/workflow.ts (3)
57-63: ⚠️ Potential issue

Potential security issue with tool inputs exposure

Directly stringifying and including `toolKwargs` in the log may expose sensitive information like API keys or personal data that might be passed to tools.

```diff
-const inputString = JSON.stringify(event.data.toolKwargs);
-transformedEvent = toAgentRunEvent({
-  agent: event.data.agentName,
-  text: `Using tool: '${event.data.toolName}' with inputs: '${inputString}'`,
-  type: "text",
-});
+// Sanitize and limit tool input exposure
+const argKeys = Object.keys(event.data.toolKwargs).join(", ");
+transformedEvent = toAgentRunEvent({
+  agent: event.data.agentName,
+  text: `Using tool: '${event.data.toolName}' with arguments: ${argKeys}`,
+  type: "text",
+});
```
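The names-only logging suggested in that diff can be factored into a small helper. This is an illustrative sketch, not code from the PR; `describeToolCall` is a hypothetical name, and the point is simply that only argument names, never values, reach the log line:

```typescript
// Hypothetical helper: summarize a tool call for logs without leaking
// argument values (which may contain API keys or personal data).
function describeToolCall(
  toolName: string,
  toolKwargs: Record<string, unknown>,
): string {
  // Only the argument names are included, never the values.
  const argKeys = Object.keys(toolKwargs).join(", ");
  return `Using tool: '${toolName}' with arguments: ${argKeys}`;
}

// Example: the secret value never appears in the log line.
const line = describeToolCall("query_index", {
  query: "revenue 2024",
  apiKey: "sk-secret",
});
console.log(line); // Using tool: 'query_index' with arguments: query, apiKey
```

Dropping values does lose debugging detail; an alternative is a per-key allowlist or redaction, but names-only is the safest default.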
65-76: 🛠️ Refactor suggestion

Unsafe type casting bypasses type safety

The double type assertion `as unknown as NodeWithScore<Metadata>[]` bypasses TypeScript's type safety mechanisms. Consider using proper validation before casting.

```diff
-const sourceNodes =
-  rawOutput.sourceNodes as unknown as NodeWithScore<Metadata>[];
-transformedEvent = toSourceEvent(sourceNodes);
+// Validate the sourceNodes structure before casting
+if (Array.isArray(rawOutput.sourceNodes) &&
+    rawOutput.sourceNodes.every(node =>
+      node && typeof node === 'object' &&
+      'node' in node && 'score' in node && 'metadata' in node)) {
+  const sourceNodes = rawOutput.sourceNodes as NodeWithScore<Metadata>[];
+  transformedEvent = toSourceEvent(sourceNodes);
+} else {
+  console.warn("Invalid sourceNodes format in AgentToolCallResult");
+}
```
52-85: 🛠️ Refactor suggestion

Add error handling to the transform function

Missing error handling in the transform function could cause the entire stream to terminate if an error occurs during transformation.

```diff
 async transform(event, controller) {
+  try {
     console.log("event", event.data);
     let transformedEvent = event;

     // Handle agent events from AgentToolCall
     if (agentToolCallEvent.include(event)) {
       const inputString = JSON.stringify(event.data.toolKwargs);
       transformedEvent = toAgentRunEvent({
         agent: event.data.agentName,
         text: `Using tool: '${event.data.toolName}' with inputs: '${inputString}'`,
         type: "text",
       });
     }
     // Handle source nodes from AgentToolCallResult
     else if (agentToolCallResultEvent.include(event)) {
       const rawOutput = event.data.raw;
       if (
         rawOutput &&
         typeof rawOutput === "object" &&
         "sourceNodes" in rawOutput // TODO: better use Zod to validate and extract sourceNodes from toolCallResult
       ) {
         const sourceNodes =
           rawOutput.sourceNodes as unknown as NodeWithScore<Metadata>[];
         transformedEvent = toSourceEvent(sourceNodes);
       }
     }

     // Post-process for llama-cloud files
     if (sourceEvent.include(transformedEvent)) {
       const sourceNodesForDownload = transformedEvent.data.data.nodes; // These are SourceEventNode[]
       downloadLlamaCloudFilesFromNodes(sourceNodesForDownload); // download files in background
     }

     controller.enqueue(transformedEvent);
+  } catch (error) {
+    console.error("Error transforming workflow event:", error);
+    // Enqueue the original event to ensure stream continues
+    controller.enqueue(event);
+  }
 },
```

packages/create-llama/templates/components/workflows/typescript/financial_report/workflow.ts (2)
107-123: 🛠️ Refactor suggestion

Add fallback for non-streaming LLM responses

If the LLM does not support streaming (`responseGenerator` is undefined), the accumulated `response` string stays empty, resulting in an empty answer for the user even though `toolCallResponse.response` is available.

```diff
 const generator = toolCallResponse.responseGenerator;
 let response = "";
 if (generator) {
   for await (const chunk of generator) {
     response += chunk.delta;
     sendEvent(
       agentStreamEvent.with({
         delta: chunk.delta,
         response,
         currentAgentName: "LLM", // Or derive from context if needed
         raw: chunk.raw,
       }),
     );
   }
+} else if (toolCallResponse.response) {
+  // Fallback for non-streaming models
+  response = toolCallResponse.response;
+  sendEvent(
+    agentStreamEvent.with({
+      delta: response,
+      response,
+      currentAgentName: "LLM",
+      raw: null,
+    }),
+  );
 }
 return stopAgentEvent.with({ result: response });
```
264-283: 🛠️ Refactor suggestion

Add error handling for tool calls

Tool calls can fail due to external factors like network issues, quota limits, or runtime errors. Without try/catch, an unhandled exception will terminate the entire workflow and appear as a 500 error to the user.

```diff
 // Call the code interpreter tools if needed
 if (toolCalls.length > 0) {
+  try {
     const toolMsgs = await callTools({
       tools: [codeInterpreterTool],
       toolCalls,
       writeEvent: (text, step) => {
         sendEvent(
           toAgentRunEvent({
             agent: "Analyst",
             text,
             type: toolCalls.length > 1 ? "progress" : "text",
             current: step,
             total: toolCalls.length,
           }),
         );
       },
     });
     for (const toolMsg of toolMsgs) {
       state.memory.put(toolMsg);
     }
+  } catch (error) {
+    sendEvent(
+      agentStreamEvent.with({
+        delta: String(error),
+        response: String(error),
+        currentAgentName: "Analyst",
+        raw: null,
+      }),
+    );
+    // Add error message to memory and continue the workflow
+    state.memory.put({
+      role: "assistant",
+      content: `Error during analysis: ${String(error)}`,
+    });
+  }
 }
```

packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
116-118: Replace `{}` with `void` for empty payload types

The use of `{}` as a type is flagged by Biome because it matches any non-null/undefined value. Using `void` better communicates that these events carry no data and will satisfy the linter.

```diff
-const planResearchEvent = workflowEvent<{}>();
+const planResearchEvent = workflowEvent<void>();
 const researchEvent = workflowEvent<ResearchQuestion>();
-const reportEvent = workflowEvent<{}>();
+const reportEvent = workflowEvent<void>();
```
🪛 Biome (1.9.4)
[error] 116-116: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
[error] 118-118: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
🧹 Nitpick comments (1)
packages/server/src/utils/workflow.ts (1)
53-53: Remove debug logging from production code

The `console.log` statement should be removed or wrapped in a debug flag condition to avoid noisy logs in production.

```diff
-console.log("event", event.data);
+// Only log in development/debug mode
+if (process.env.NODE_ENV === 'development') {
+  console.log("event", event.data);
+}
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
- packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (5 hunks)
- packages/create-llama/templates/components/workflows/typescript/financial_report/workflow.ts (7 hunks)
- packages/server/package.json (1 hunks)
- packages/server/src/handlers/chat.ts (2 hunks)
- packages/server/src/utils/gen-ui.ts (2 hunks)
- packages/server/src/utils/stream.ts (1 hunks)
- packages/server/src/utils/suggestion.ts (2 hunks)
- packages/server/src/utils/workflow.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/server/package.json
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/server/src/utils/gen-ui.ts
- packages/server/src/utils/suggestion.ts
🧰 Additional context used
🧬 Code Graph Analysis (2)
packages/server/src/utils/workflow.ts (1)
packages/server/src/events.ts (3)
- `toAgentRunEvent` (65-86)
- `toSourceEvent` (55-63)
- `sourceEvent` (24-24)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (1)
packages/server/src/events.ts (1)
toSourceEvent(55-63)
🪛 Biome (1.9.4)
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts
[error] 116-116: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
[error] 118-118: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
🔇 Additional comments (17)
packages/server/src/handlers/chat.ts (6)
23-26: Type mapping ensures proper message format conversion

The mapping creates a properly typed `chatHistory` array from Vercel AI's `Message` type to LlamaIndex's `MessageType`, ensuring compatibility with the workflow API.

34-37: Well-structured workflow input object with chat history

The input object correctly includes both the user's immediate input and the full chat history, providing context for the workflow processing.

39-40: Good resource management with AbortController

The implementation properly handles client disconnections using an AbortController, which prevents server resources from being wasted if the client disconnects during a long-running workflow.

43-47: Clean workflow execution with proper parameter passing

The handler correctly awaits the workflow factory result and executes it with appropriate inputs and abort signal. The refactored code is more concise and easier to understand than the previous implementation.

49-59: Effective use of streaming callbacks

The implementation effectively uses the new `toDataStream` utility with lifecycle callbacks, particularly the `onFinal` callback that updates chat history and triggers suggested questions when the completion finishes.

62-62: Improved error logging with context

Adding "Chat handler error:" prefix to error logs helps with identifying the source of errors in logs, making debugging easier.
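The disconnect handling praised above follows a common Node pattern: tie an AbortController to the response's close event and stop pulling workflow events once it fires. A minimal self-contained sketch of that pattern, using plain async iterables rather than any LlamaIndex API:

```typescript
// Consume an async event stream, stopping early once the signal aborts.
// `events` stands in for the workflow event stream; in the real handler the
// controller would be aborted from res.on("close", ...).
async function consumeUntilAborted<T>(
  events: AsyncIterable<T>,
  signal: AbortSignal,
): Promise<T[]> {
  const seen: T[] = [];
  for await (const event of events) {
    if (signal.aborted) break; // client went away: stop doing work
    seen.push(event);
  }
  return seen;
}

// Fake event source that simulates a client disconnect mid-stream.
async function* fakeEvents(controller: AbortController): AsyncGenerator<number> {
  for (let i = 0; i < 10; i++) {
    if (i === 3) controller.abort(); // "disconnect" before the 4th event
    yield i;
  }
}

async function main() {
  const controller = new AbortController();
  const seen = await consumeUntilAborted(fakeEvents(controller), controller.signal);
  console.log(seen); // events after the disconnect are dropped
}
main();
```

Breaking out of the `for await` loop also calls the generator's `return()`, so the producer side is cleaned up rather than left running.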
packages/server/src/utils/stream.ts (2)
12-27: Well-documented callback interface for stream lifecycle

The `StreamCallbacks` interface provides clear, well-documented callbacks for important stream lifecycle events, making it easier for consumers to hook into different parts of the streaming process.

74-78: Good error handling with informative messages

The error handling distinguishes between known Error instances and unknown errors, providing informative error messages for stream consumers.
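The lifecycle those callbacks describe (per-delta `onText`, accumulated text to `onFinal`) can be sketched independently of the server code. This imitation uses hypothetical names (`StreamLifecycle`, `pumpDeltas`) and plain async generators; it is not the actual `toDataStream` implementation:

```typescript
// Illustrative callback interface mirroring the lifecycle discussed above.
interface StreamLifecycle {
  onStart?: () => void | Promise<void>;
  onText?: (delta: string) => void | Promise<void>;
  onFinal?: (completion: string) => void | Promise<void>;
}

// Drain a stream of text deltas, invoking callbacks at each stage and
// returning the accumulated completion.
async function pumpDeltas(
  deltas: AsyncIterable<string>,
  callbacks: StreamLifecycle,
): Promise<string> {
  await callbacks.onStart?.();
  let completion = "";
  for await (const delta of deltas) {
    completion += delta;
    await callbacks.onText?.(delta);
  }
  // onFinal sees the full accumulated text, never a partial prefix.
  await callbacks.onFinal?.(completion);
  return completion;
}

async function* chunks(): AsyncGenerator<string> {
  yield "Hello, ";
  yield "world";
}

pumpDeltas(chunks(), {
  onFinal: (completion) => console.log(completion), // Hello, world
});
```

This is why an `onFinal` hook is a natural place to persist chat history or trigger suggested questions: it fires exactly once, with the complete response.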
packages/server/src/utils/workflow.ts (1)
30-44: Clean workflow execution with proper validation

The refactored `runWorkflow` function correctly validates user input and uses the new `run` function with appropriate events. The use of `until` with the abort signal is a clean way to handle early termination.

packages/create-llama/templates/components/workflows/typescript/financial_report/workflow.ts (3)
55-59: Clean event definitions using workflowEvent factory

The refactored code effectively uses the `workflowEvent` factory function to define typed events, which is cleaner and more maintainable than class-based event definitions.

73-79: Good validation of LLM capabilities

The code properly validates that the LLM supports tool calls before proceeding, which prevents runtime errors when attempting to use tool functionality.

81-97: Effective middleware-based state management

The refactored code uses middleware-based state management for the workflow, which provides a clean way to manage chat memory and other state across event handlers.
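The factory-plus-type-guard pattern behind `workflowEvent` can be imitated in a few lines of plain TypeScript. This sketch is not the `@llamaindex/workflow` implementation (`makeEvent`, `deltaEvent`, `nodesEvent` are invented names); it only illustrates why calls like `agentStreamEvent.include(event)` can both filter a heterogeneous stream and narrow the event's data type:

```typescript
// Tagged event instances plus an `include` guard, mimicking the pattern.
interface EventInstance<T> {
  tag: symbol;
  data: T;
}

function makeEvent<T>() {
  const tag = Symbol("workflow-event");
  return {
    with: (data: T): EventInstance<T> => ({ tag, data }),
    // Type guard: matches only instances created by this factory,
    // and narrows `data` to T for the caller.
    include: (e: EventInstance<unknown>): e is EventInstance<T> => e.tag === tag,
  };
}

const deltaEvent = makeEvent<{ delta: string }>();
const nodesEvent = makeEvent<{ nodes: string[] }>();

const events: EventInstance<unknown>[] = [
  deltaEvent.with({ delta: "Hello" }),
  nodesEvent.with({ nodes: ["report.pdf"] }),
  deltaEvent.with({ delta: " world" }),
];

// `filter` with the guard yields correctly typed instances.
const text = events
  .filter(deltaEvent.include)
  .map((e) => e.data.delta)
  .join("");
console.log(text); // Hello world
```

Compared with class-based events, each factory call yields a fresh tag, so two events with identical payload shapes remain distinguishable at runtime.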
packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts (5)
27-27: LGTM: Clean workflow factory implementation

The workflow factory has been simplified to use the `getWorkflow` function, which aligns well with the functional approach of the refactored codebase.

147-354: Function-based workflow implementation looks good

The refactoring from a class-based to a function-based implementation using middleware and event handlers follows best practices for the new workflow engine. The code is well-structured with clear separation of concerns.

183-183: Good use of the toSourceEvent utility

Using the `toSourceEvent` utility from `@llamaindex/server` to transform retrieved nodes is a nice integration with the server package, supporting the PR objective.

356-422: Good organization of helper functions

Extracting helper functions and placing them at the end of the file improves readability and organization, as previously suggested in review comments.

236-257: ✅ Verification successful
Check incrementation of totalQuestions

Make sure that `totalQuestions` is only incremented once. Previous review comments indicated it was being incremented twice, though I only see one increment at line 237 in the current implementation.
🏁 Script executed:
```shell
#!/bin/bash
# Check if totalQuestions is incremented elsewhere in the file
rg "totalQuestions \+=" packages/create-llama/templates/components/workflows/typescript/deep_research/workflow.ts
```

Length of output: 171
No duplicate `totalQuestions` increment found – no action needed

Verification via ripgrep confirms that `state.totalQuestions` is only incremented once at line 237 in `workflow.ts`. There are no other occurrences of `totalQuestions +=` in this file, so the counter isn't being incremented twice.
Actionable comments posted: 2
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- packages/create-llama/templates/components/agents/typescript/financial_report/workflow/fin-report.ts (1 hunks)
- packages/create-llama/templates/components/agents/typescript/form_filling/workflow/form-filling.ts (1 hunks)
Summary by CodeRabbit
New Features
Refactor
Chores