feat: support human in the loop for TS #686
🦋 Changeset detected. Latest commit: 5b06106. The changes in this PR will be included in the next version bump. This PR includes changesets to release 4 packages.
Walkthrough

This update introduces human-in-the-loop (HITL) workflow capabilities to the server package. It adds event schemas, snapshot/resume logic, and stream handling for human input, alongside a CLI-based workflow example. Core workflow and streaming utilities are refactored to support HITL events, and new exports are provided for HITL-related modules and helpers.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant API
    participant Workflow
    participant HITL Utils
    participant Human (UI)
    participant Snapshot Store
    Client->>API: Send chat request (may include id/human responses)
    API->>Workflow: runWorkflow({ workflow, input, human })
    alt Resume from snapshot
        Workflow->>HITL Utils: resumeWorkflowFromHumanResponses
        HITL Utils->>Snapshot Store: loadSnapshot(requestId)
        HITL Utils-->>Workflow: Resume context with human responses
    else New workflow
        Workflow-->>API: Create new context, start event
    end
    Workflow-->>API: Return workflow context/stream
    API->>API: processWorkflowStream
    loop Stream events
        API->>Client: Stream assistant responses/events
        alt HITL event (pause for input)
            API->>HITL Utils: pauseForHumanInput(context, responseEvent, requestId)
            HITL Utils->>Snapshot Store: saveSnapshot(requestId, snapshot)
            API->>Human (UI): Request human input
            Human (UI)->>API: Send human response
            API->>Workflow: Resume workflow with human response
        end
    end
```
Actionable comments posted: 7
🧹 Nitpick comments (3)
packages/server/src/utils/stream.ts (1)
29-30: Documentation improvement suggestion

The comment has extra spaces that should be cleaned up for consistency.

```diff
- /** `onPauseForHumanInput`:  Called when human input event is emitted. */
+ /** `onPauseForHumanInput`: Called when human input event is emitted. */
```

packages/server/src/utils/hitl.ts (2)
1-1: Consider removing the ESLint disable comment.

The blanket disable of `@typescript-eslint/no-explicit-any` suggests heavy usage of `any` types. Consider using more specific types or generics where possible to maintain type safety.

```diff
-/* eslint-disable @typescript-eslint/no-explicit-any */
```
18-22: Address the TODO for file-based persistence.

The in-memory storage is not suitable for production use. Consider implementing file-based or database persistence for workflow snapshots.

Would you like me to generate an implementation for file-based snapshot persistence using the filesystem or a database solution?
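For reference, a minimal file-based store could look like the sketch below. The directory path, function names, and JSON serialization are assumptions for illustration, not the PR's actual API:

```typescript
import { mkdir, readFile, writeFile } from "node:fs/promises";
import { join } from "node:path";

// Assumed location for persisted snapshots (illustrative only)
const SNAPSHOT_DIR = "./snapshots";

export async function saveSnapshot(
  requestId: string,
  snapshot: unknown,
): Promise<void> {
  await mkdir(SNAPSHOT_DIR, { recursive: true });
  // JSON suffices as long as snapshots are plain serializable data
  await writeFile(
    join(SNAPSHOT_DIR, `${requestId}.json`),
    JSON.stringify(snapshot),
    "utf8",
  );
}

export async function loadSnapshot(requestId: string): Promise<unknown> {
  try {
    const raw = await readFile(join(SNAPSHOT_DIR, `${requestId}.json`), "utf8");
    return JSON.parse(raw);
  } catch {
    return undefined; // treat a missing file as "no snapshot"
  }
}
```

A real implementation would also want snapshot expiry and sanitization of `requestId` before using it as a filename.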
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- packages/server/src/handlers/chat.ts (3 hunks)
- packages/server/src/utils/hitl.ts (1 hunks)
- packages/server/src/utils/index.ts (1 hunks)
- packages/server/src/utils/stream.ts (3 hunks)
- packages/server/src/utils/workflow.ts (2 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)

packages/server/src/utils/stream.ts (1)
- packages/server/src/utils/hitl.ts (2)
  - HumanInputEventData (31-34)
  - humanInputEvent (39-39)

packages/server/src/handlers/chat.ts (3)
- packages/server/src/utils/hitl.ts (3)
  - getHumanResponseFromMessage (58-68)
  - createWorkflowContextFromHumanResponse (70-96)
  - pauseForHumanInput (98-113)
- packages/server/src/utils/workflow.ts (1)
  - runWorkflow (26-46)
- packages/server/src/utils/stream.ts (1)
  - toDataStream (39-90)

packages/server/src/utils/hitl.ts (1)
- packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  - Message (6-10)
⏰ Context from checks skipped due to timeout of 90000ms (57)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
- GitHub Check: Unit Tests (windows-latest, 3.9)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: Unit Tests (ubuntu-latest, 3.9)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
- GitHub Check: lint
🔇 Additional comments (6)
packages/server/src/utils/index.ts (1)
4-4: LGTM!

The export addition follows the established pattern and correctly exposes the HITL utilities.
packages/server/src/utils/stream.ts (1)
68-73: LGTM! Stream handling for human input events is well implemented.

The implementation correctly:
- Invokes the callback when a human input event is detected
- Writes the event data as a message annotation for persistence
- Breaks the stream to pause processing, which is essential for HITL workflows
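As a rough illustration of that pause-and-break behavior (the event shapes and callback name here are assumptions based on the review text, not the package's real types):

```typescript
type StreamEvent =
  | { type: "delta"; text: string }
  | { type: "humanInput"; data: { command: string } };

// Stand-in for the workflow's event stream
async function* demoStream(): AsyncGenerator<StreamEvent> {
  yield { type: "delta", text: "Preparing command... " };
  yield { type: "humanInput", data: { command: "rm -rf ./tmp" } };
  yield { type: "delta", text: "only reached after resume" };
}

async function processStream(
  stream: AsyncGenerator<StreamEvent>,
  onPauseForHumanInput: (data: unknown) => void,
): Promise<string> {
  let out = "";
  for await (const event of stream) {
    if (event.type === "humanInput") {
      // Snapshot the workflow and ask the human (annotation writing omitted)
      onPauseForHumanInput(event.data);
      break; // stop consuming: the workflow resumes in a later request
    }
    out += event.text;
  }
  return out;
}
```

Breaking out of the loop (rather than throwing) lets the response stream close cleanly while the saved snapshot holds the pending state.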
packages/server/src/handlers/chat.ts (1)
27-27: ❓ Verification inconclusive

Address TODO and add validation for requestId

The TODO comment indicates this implementation might be incomplete. Additionally, the requestId is extracted without validation, which could cause issues if it's missing or not a string.

```diff
- const requestId = req.headers["x-request-id"] as string; // TODO: update for chat route also
+ const requestId = req.headers["x-request-id"];
+ if (!requestId || typeof requestId !== "string") {
+   return sendJSONResponse(res, 400, {
+     error: "Missing or invalid x-request-id header",
+   });
+ }
```

Can you clarify what needs to be updated for the chat route as mentioned in the TODO?
🏁 Script executed:

```shell
#!/bin/bash
set -euxo pipefail

# Locate and print the beginning of the chat handler
FILE=$(fd chat.ts)
echo "Found chat handler at: $FILE"
echo "---- Chat Handler (lines 1–200) ----"
sed -n '1,200p' "$FILE"

# Show occurrences of x-request-id in all handlers for comparison
echo
echo "---- x-request-id usages in handlers ----"
rg -n -C2 "x-request-id" packages/server/src/handlers

# Show patterns of requestId extraction across handlers
echo
echo "---- requestId extraction in handlers ----"
rg -n -C2 "const requestId" packages/server/src/handlers
```

Length of output: 2032
```shell
#!/bin/bash
set -euxo pipefail

# Locate the specific chat handler in the server package
echo "Searching for server chat handler file..."
find packages/server/src/handlers -type f -name chat.ts

# Display the first 200 lines of the handler to inspect the TODO and header extraction
FILE="packages/server/src/handlers/chat.ts"
echo "---- $FILE (lines 1–200) ----"
sed -n '1,200p' "$FILE"

# Show occurrences of x-request-id and requestId usage in this file
echo
echo "---- x-request-id usage in $FILE ----"
rg -n -C2 "x-request-id" "$FILE"
echo
echo "---- requestId extraction in $FILE ----"
rg -n -C2 "const requestId" "$FILE"
```

```shell
#!/bin/bash
set -euxo pipefail

# Verify chat handler exists
FILE="packages/server/src/handlers/chat.ts"
echo "Exists: $( [ -f "$FILE" ] && echo true || echo false )"

# Show header extraction and TODO around line 27
echo "---- $FILE (lines 1–60) ----"
sed -n '1,60p' "$FILE"
echo
echo "---- $FILE (lines 20–40) ----"
sed -n '20,40p' "$FILE"

# Show x-request-id usage in chat handler
echo
echo "---- x-request-id in chat.ts ----"
rg -n -C2 "x-request-id" "$FILE"

# Compare requestId extraction in other handlers
echo
echo "---- requestId in other handlers ----"
rg -n -C2 "const requestId" packages/server/src/handlers
```

```shell
#!/bin/bash
set -euxo pipefail

# Path to the chat handler in the server package
FILE="packages/server/src/handlers/chat.ts"
if [ ! -f "$FILE" ]; then
  echo "Error: $FILE not found" >&2
  exit 1
fi

# Show the start of the file including the TODO
echo "---- $FILE (lines 1–60) ----"
sed -n '1,60p' "$FILE"

# Narrow in on the TODO comment area around line 27
echo
echo "---- $FILE (lines 20–40) ----"
sed -n '20,40p' "$FILE"

# Show any x-request-id header references in this file
echo
echo "---- x-request-id usage in $FILE ----"
rg -n -C2 "x-request-id" "$FILE" || echo "(none found)"

# Compare how other handlers extract requestId
echo
echo "---- requestId extraction in all server handlers ----"
rg -n -C2 "const requestId" packages/server/src/handlers || echo "(none found)"
```
Unable to locate chat.ts TODO – please verify

I couldn't find or inspect the `// TODO: update for chat route also` line in packages/server/src/handlers/chat.ts. Could you:
- Confirm that the file path is correct
- Share lines 20–40 (or surrounding the TODO) from that file
With that snippet I can verify the existing pattern in other handlers and clarify what “update for chat route” entails.
packages/server/src/utils/workflow.ts (1)
34-40: Approve the workflow execution refactoring.

The change from calling `workflow.run()` to using `context.sendEvent()` and `context.stream` is correct and aligns with the HITL workflow pattern where contexts can be resumed from snapshots.

packages/server/src/utils/hitl.ts (2)
12-16: Approve the type inference approach.

Using `ReturnType` to infer snapshot types from the `withSnapshot` function is a clever approach when the library doesn't export these types directly.
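The trick generalizes whenever a library returns richly typed values without exporting their types. A self-contained sketch, with a stand-in `withSnapshot` rather than the real library function:

```typescript
// Stand-in for a library function whose return type is not exported
function withSnapshot(workflow: { name: string }) {
  return {
    workflow,
    snapshot: () => ({ data: [workflow.name], version: 1 }),
  };
}

// Recover the wrapper and snapshot types via the ReturnType utility type
type SnapshotWorkflow = ReturnType<typeof withSnapshot>;
type SnapshotData = ReturnType<SnapshotWorkflow["snapshot"]>;

const wf: SnapshotWorkflow = withSnapshot({ name: "hitl" });
const snap: SnapshotData = wf.snapshot();
```

If the library later exports proper types, these aliases can be swapped out without touching call sites.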
31-39: Approve the event type definitions.

The `HumanInputEventData` structure and `humanInputEvent` definition are well-designed for extensibility.
Actionable comments posted: 6
♻️ Duplicate comments (1)
packages/server/src/utils/workflow.ts (1)
30-41: Fix incorrect `runWorkflow` calls to use the new signature.

The function signature has been changed to accept an object parameter, but some call sites may still be using the old signature.
🧹 Nitpick comments (4)
packages/server/examples/hitl/README.md (1)
1-1: Improve phrasing & article usage in README headline

The opening sentence is grammatically off and slightly vague. A concise alternative would be:

```diff
-This example demonstrates human in the loop workflow.
+This example demonstrates **a human-in-the-loop workflow**.
```

This small tweak clarifies intent and reads more naturally.
🧰 Tools
🪛 LanguageTool
[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: "...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod..." (AI_EN_LECTOR_MISSING_DETERMINER_THE)
packages/server/src/utils/hitl/pause.ts (1)
15-19: Consider returning the generated `requestId`

`pauseForHumanInput` generates or receives a `requestId`, persists a snapshot, but discards the ID. Returning it lets callers persist or surface the identifier to clients without guessing.

```diff
- await saveSnapshot(requestId, snapshotData);
+ await saveSnapshot(requestId, snapshotData);
+
+ // Expose the identifier so the caller / client can resume later
+ return requestId;
```

This is a backwards-compatible, single-line change that improves ergonomics.
packages/server/examples/hitl/components/cli_human_input.tsx (1)
75-80: `disabled` input cannot be edited – remove `onChange` / state

The field is rendered with `disabled`, so `onChange` will never fire. `editableCommand` state is redundant unless you intend to make the field editable later.

```diff
- <input
-   disabled
-   type="text"
-   value={editableCommand || ""}
-   onChange={(e) => setEditableCommand(e.target.value)}
+ <input
+   disabled
+   type="text"
+   value={inputEvent.command}
    className="my-2 w-full overflow-x-auto rounded border border-gray-300 bg-gray-100 p-3 font-mono text-xs text-gray-800"
  />
```

packages/server/src/utils/hitl/events.ts (1)
5-8: Consider using more specific types instead of `z.any()`.

Using `z.any()` for the data field removes type safety. Consider using a discriminated union or generic types for better type safety.

Example approach using discriminated unions:

```diff
-export const humanInputEventSchema = z.object({
-  type: z.string(), // An identifier for the input component in UI
-  data: z.any(), // The data to be sent to the input component in UI
-});
+export const humanInputEventSchema = z.discriminatedUnion("type", [
+  z.object({
+    type: z.literal("cli_human_input"),
+    data: z.object({
+      command: z.string(),
+    }),
+  }),
+  // Add other input types as needed
+  z.object({
+    type: z.string(),
+    data: z.unknown(), // Fallback for unknown types
+  }),
+]);
```
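The same narrowing can be had without the zod dependency, as a plain tagged union plus a runtime type guard; the event names below are illustrative, matching the example component rather than a published schema:

```typescript
type CliHumanInput = { type: "cli_human_input"; data: { command: string } };
type OtherInput = { type: string; data: unknown };
type HumanInput = CliHumanInput | OtherInput;

// Runtime guard mirroring what z.discriminatedUnion would check
function isCliInput(event: HumanInput): event is CliHumanInput {
  return (
    event.type === "cli_human_input" &&
    typeof (event.data as { command?: unknown } | null)?.command === "string"
  );
}

const ev: HumanInput = { type: "cli_human_input", data: { command: "ls -la" } };
```

After the guard passes, `event.data.command` is statically known to be a string, so downstream code needs no casts.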
📒 Files selected for processing (17)
- .changeset/eight-moons-perform.md (1 hunks)
- packages/server/examples/hitl/README.md (1 hunks)
- packages/server/examples/hitl/components/cli_human_input.tsx (1 hunks)
- packages/server/examples/hitl/index.ts (1 hunks)
- packages/server/examples/hitl/src/app/tools.ts (1 hunks)
- packages/server/examples/hitl/src/app/workflow.ts (1 hunks)
- packages/server/next/app/api/chat/route.ts (3 hunks)
- packages/server/package.json (1 hunks)
- packages/server/src/handlers/chat.ts (1 hunks)
- packages/server/src/index.ts (1 hunks)
- packages/server/src/utils/hitl/events.ts (1 hunks)
- packages/server/src/utils/hitl/index.ts (1 hunks)
- packages/server/src/utils/hitl/pause.ts (1 hunks)
- packages/server/src/utils/hitl/resume.ts (1 hunks)
- packages/server/src/utils/hitl/snapshot.ts (1 hunks)
- packages/server/src/utils/stream.ts (4 hunks)
- packages/server/src/utils/workflow.ts (2 hunks)
✅ Files skipped from review due to trivial changes (5)
- .changeset/eight-moons-perform.md
- packages/server/examples/hitl/index.ts
- packages/server/package.json
- packages/server/src/utils/hitl/index.ts
- packages/server/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/server/src/utils/stream.ts
- packages/server/src/handlers/chat.ts
🧰 Additional context used
🧬 Code Graph Analysis (6)
packages/server/src/utils/hitl/pause.ts (2)
- packages/server/src/utils/hitl/snapshot.ts (2)
  - ensureSnapshotWorkflowContext (69-78)
  - saveSnapshot (29-42)
- packages/server/src/utils/hitl/events.ts (1)
  - humanResponseEvent (26-26)

packages/server/src/utils/hitl/resume.ts (2)
- packages/server/src/utils/hitl/events.ts (1)
  - HumanResponseEventData (22-22)
- packages/server/src/utils/hitl/snapshot.ts (3)
  - SnapshotWorkflowContext (11-13)
  - ensureSnapshotWorkflow (60-67)
  - loadSnapshot (44-58)

packages/server/next/app/api/chat/route.ts (6)
- packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  - Message (6-10)
- packages/server/src/utils/workflow.ts (2)
  - runWorkflow (30-63)
  - processWorkflowStream (65-111)
- packages/server/examples/hitl/src/app/workflow.ts (1)
  - workflowFactory (35-114)
- packages/server/src/utils/hitl/events.ts (1)
  - getHumanResponsesFromMessage (29-38)
- packages/server/src/utils/stream.ts (1)
  - toDataStream (42-94)
- packages/server/src/utils/hitl/pause.ts (1)
  - pauseForHumanInput (7-20)

packages/server/examples/hitl/src/app/workflow.ts (4)
- packages/server/src/utils/hitl/events.ts (3)
  - humanInputEventSchema (5-8)
  - humanInputEvent (15-15)
  - humanResponseEvent (26-26)
- packages/server/examples/hitl/src/app/tools.ts (1)
  - cliExecutor (5-20)
- packages/server/src/utils/events.ts (1)
  - toAgentRunEvent (72-93)
- packages/server/src/utils/stream.ts (1)
  - writeResponseToStream (96-115)

packages/server/src/utils/workflow.ts (2)
- packages/server/src/utils/hitl/events.ts (1)
  - HumanResponseEventData (22-22)
- packages/server/src/utils/hitl/resume.ts (1)
  - resumeWorkflowFromHumanResponses (10-28)

packages/server/src/utils/hitl/events.ts (1)
- packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  - Message (6-10)
🔇 Additional comments (8)
packages/server/src/utils/hitl/pause.ts (1)
15-16: Verify `request(humanResponseEvent)` call signature

If `request` expects `(eventDef, data?)`, passing only the event definition will raise at runtime. Please double-check the expected overload; add an explicit `undefined` or payload if needed.

packages/server/src/utils/hitl/resume.ts (1)
24-26: Potential missing `await` for `resume`

If `snapshotWorkflow.resume()` is asynchronous (common in IO-heavy workflows), omitting `await` will return a pending Promise instead of a ready context.

```diff
- const context = snapshotWorkflow.resume(humanResponses, snapshot);
+ const context = await snapshotWorkflow.resume(humanResponses, snapshot);
```

Please confirm the method's return type and adjust accordingly.
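A tiny repro of what goes wrong without the `await` (using a stand-in `resume`, not the real method):

```typescript
// Stand-in for snapshotWorkflow.resume()
async function resume(): Promise<{ ready: boolean }> {
  return { ready: true };
}

const withoutAwait = resume(); // a pending Promise, not a context object
const withAwait = await resume(); // the resolved context object
```

Any code that immediately reads properties off `withoutAwait` (e.g. `withoutAwait.ready`) gets `undefined`, which is why the missing `await` tends to surface as confusing downstream errors rather than at the call site.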
packages/server/examples/hitl/components/cli_human_input.tsx (1)
35-50: `editableCommand` may be `undefined` – ensure non-null data in payload

`handleConfirm` sends `{ command: editableCommand }`, which may be `undefined` if editing is never enabled. Use the original command to avoid downstream parsing issues.

```diff
- command: editableCommand,
+ command: inputEvent.command,
```

packages/server/src/utils/workflow.ts (1)
65-65: LGTM! Export addition enables proper stream processing.

Making `processWorkflowStream` explicitly exported is appropriate for the HITL implementation.

packages/server/next/app/api/chat/route.ts (2)
51-58: LGTM! Proper integration of HITL functionality.

The updated `runWorkflow` call correctly uses the new signature and properly handles human responses and snapshot restoration.
67-67: 🛠️ Refactor suggestion

Consider making the snapshot ID generation more robust.

Currently using `requestId` directly for snapshot identification. If `requestId` is undefined, the snapshot will be saved with "undefined" as the filename. Consider providing a fallback:

```diff
- onPauseForHumanInput: () => pauseForHumanInput(context, requestId),
+ onPauseForHumanInput: () => pauseForHumanInput(context, requestId || undefined),
```

The `pauseForHumanInput` function already generates a UUID if no ID is provided, so passing `undefined` explicitly would be cleaner than passing the string "undefined".

Likely an incorrect or invalid review comment.
packages/server/src/utils/hitl/events.ts (1)
29-38: ❓ Verification inconclusive

Verify that the Message type includes an annotations field.

The function accesses `message.annotations` but the Message interface shown in relevant snippets doesn't include this field.

Let me verify the Message type definition:
🏁 Script executed:
```shell
#!/bin/bash
# Search for Message type definition with annotations field
ast-grep --pattern 'interface Message {
  $$$
}'

# Also search for any type extensions
rg -A 10 "interface.*Message|type.*Message.*=" --type ts
```

Length of output: 1543
🏁 Script executed:
```shell
#!/bin/bash
# Locate the events.ts file(s)
files=$(fd 'events.ts' -e ts)
echo "Found events.ts files:"
echo "$files"
echo

# For each file, print the imports and the function definition
for f in $files; do
  echo "===== $f ====="
  sed -n '1,200p' "$f"
  echo
done
```

Length of output: 12639
🏁 Script executed:
```shell
#!/usr/bin/env bash
# Search for any TypeScript module augmentations for "ai"
rg "declare module .*['\"]ai['\"]" -n .
```

Length of output: 45
Confirm that `ai`'s Message type declares `annotations`

Please verify that the `Message` interface you're importing from the `ai` package includes an optional `annotations` field (e.g. `annotations?: unknown[]`). Without this property, the call to `message.annotations?.filter(...)` will not type-check.

- Inspect `node_modules/ai/index.d.ts` (or equivalent) for the `Message` declaration and confirm `annotations` is present and correctly typed.

packages/server/src/utils/hitl/snapshot.ts (1)
60-78: LGTM! Well-implemented type guards with clear error messages.

The type guard functions properly check for the required methods and provide helpful error messages when the workflow doesn't support snapshots.
Actionable comments posted: 1
♻️ Duplicate comments (1)
packages/server/src/utils/hitl/events.ts (1)
11-17: 🛠️ Refactor suggestion
⚠️ Potential issue
`type` discriminator should be a literal – keeps schema checks trivial

`HumanInputEventData.type` is declared as `string`, whereas the corresponding response event uses the literal `"human_response"`. Leaving it wide-open forces every consumer to pattern-match on additional fields or trust ad-hoc strings, defeating the purpose of a discriminator and breaking the nice `schema.safeParse` pattern you use below.

```diff
-export type HumanInputEventData = {
-  type: string;
-  data?: any;
-  response: WorkflowEvent<HumanResponseEventData>;
-};
+export type HumanInputEventData = {
+  type: "human_input";
+  /** Optional indicator for concrete UI component */
+  component?: string;
+  data?: unknown;
+  response: WorkflowEvent<HumanResponseEventData>;
+};
```

Update any callers that relied on arbitrary `type` values to use the new `component` (or another dedicated) property instead.
🧹 Nitpick comments (2)
packages/server/src/utils/hitl/events.ts (2)
40-43: Dead code? `BaseEventGet` is exported but never referenced

`BaseEventGet` is defined yet unused in this module. If nothing imports it elsewhere, drop the type alias to avoid API noise.
44-68: Potential unbounded growth of `derivedEvents` set

`workflowBaseEvent` stores every derived event ever created in `derivedEvents`, which lives for the lifetime of the process. Long-running servers that generate events dynamically (e.g., per request) will leak memory.

Consider one of:

```diff
-const derivedEvents = new Set<WorkflowEvent<any>>();
+// WeakSet lets GC collect unused events, preventing leaks
+const derivedEvents = new WeakSet<WorkflowEvent<any>>();
```

or expose a `dispose()` / `clear()` API if event factories are short-lived.
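A quick sketch of the `WeakSet` variant: membership checks still work, but entries no longer pin events in memory (the `registerEvent` helper is hypothetical):

```typescript
// WeakSet entries don't prevent garbage collection, unlike Set
const derivedEvents = new WeakSet<object>();

// Hypothetical factory-side registration helper
function registerEvent<T extends object>(event: T): T {
  derivedEvents.add(event);
  return event;
}

const ev = registerEvent({ name: "human_input" });
// derivedEvents.has(ev) answers membership; once `ev` becomes unreachable,
// its entry is collectable, so the set cannot grow without bound.
```

The trade-off is that `WeakSet` is not iterable, so this only works if the code never needs to enumerate the registered events.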
📒 Files selected for processing (7)
- packages/server/examples/hitl/src/app/events.ts (1 hunks)
- packages/server/examples/hitl/src/app/workflow.ts (1 hunks)
- packages/server/next/app/api/chat/route.ts (3 hunks)
- packages/server/src/handlers/chat.ts (1 hunks)
- packages/server/src/utils/hitl/events.ts (1 hunks)
- packages/server/src/utils/hitl/pause.ts (1 hunks)
- packages/server/src/utils/stream.ts (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (6)
- packages/server/examples/hitl/src/app/events.ts
- packages/server/examples/hitl/src/app/workflow.ts
- packages/server/src/utils/hitl/pause.ts
- packages/server/next/app/api/chat/route.ts
- packages/server/src/utils/stream.ts
- packages/server/src/handlers/chat.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/server/src/utils/hitl/events.ts (1)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
`Message` (6-10)
⏰ Context from checks skipped due to timeout of 90000ms (57)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
- GitHub Check: Unit Tests (windows-latest, 3.9)
- GitHub Check: Unit Tests (ubuntu-latest, 3.9)
- GitHub Check: lint
Actionable comments posted: 0
🧹 Nitpick comments (2)
packages/server/examples/hitl/README.md (2)
1-1: Refine wording for clarity and grammar
The article "the" and hyphens improve readability and correctness:

```diff
-This example demonstrates human in the loop workflow. Export OpenAI API key and start the server in dev mode.
+This example demonstrates the human-in-the-loop workflow. To run the example, export your OpenAI API key and start the server in development mode.
```
🧰 Tools
🪛 LanguageTool
[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: ...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod...(AI_EN_LECTOR_MISSING_DETERMINER_THE)
3-6: Add directory navigation step
Users need to `cd` into this example before running commands. For instance:

````diff
 ```bash
+cd packages/server/examples/hitl
 export OPENAI_API_KEY=<your-openai-api-key>
 npx nodemon --exec tsx index.ts --ignore output/*
 ```
````

📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📥 Commits
Reviewing files that changed from the base of the PR and between 8f107f575fbe12d9e91e801858772450996ebdc1 and 2c062c924a9fed053c158a6cee02e14f0a8b3de2.
📒 Files selected for processing (1)
- `packages/server/examples/hitl/README.md` (1 hunks)
🧰 Additional context used
🪛 LanguageTool
packages/server/examples/hitl/README.md
[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: ...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod...
(AI_EN_LECTOR_MISSING_DETERMINER_THE)
Actionable comments posted: 0
🧹 Nitpick comments (5)
packages/server/examples/hitl/README.md (5)
1-1: Clarify Human-in-the-Loop terminology
Consider hyphenating and introducing the acronym in the main title for consistency. Example:

```diff
-# Human in the Loop
+# Human-in-the-Loop (HITL)
```
9-13: Add dependency installation step
Currently the instructions skip installing dependencies and entering the example directory. Include steps to install packages and switch to the example folder. Example:

````diff
 ## Getting Started

-### Environment Setup
+### Prerequisites & Environment Setup
+
+Navigate to the example directory:
+
+```bash
+cd packages/server/examples/hitl
+```
+
+Install dependencies:
+
+```bash
+npm install
+```
+
 Export your OpenAI API key:

 ```bash
 export OPENAI_API_KEY=<your-openai-api-key>
 ```
````

27-29: Specify code block language for URL
The fenced code block containing the URL lacks a language specifier. Add one to satisfy markdownlint.

````diff
-```
-http://localhost:3000
-```
+```text
+http://localhost:3000
+```
````
🧰 Tools
🪛 markdownlint-cli2 (0.17.2)
27-27: Fenced code blocks should have a language specified
null(MD040, fenced-code-language)
64-64: Remove unnecessary comma
The comma before "that" is superfluous.

```diff
-HITL also needs a custom UI component, that is shown when the LlamaIndexServer receives the `cliHumanInputEvent`.
+HITL also needs a custom UI component that is shown when the LlamaIndexServer receives the `cliHumanInputEvent`.
```
84-84: Add missing comma for compound sentence
Insert a comma before "and" to connect the independent clauses.

```diff
-This component displays the command to execute and the user can choose to execute or cancel the command execution.
+This component displays the command to execute, and the user can choose to execute or cancel the command execution.
```
🧰 Tools
🪛 LanguageTool
[uncategorized] ~84-~84: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...(COMMA_COMPOUND_SENTENCE_2)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- `packages/server/README.md` (1 hunks)
- `packages/server/examples/hitl/README.md` (1 hunks)
- `packages/server/examples/tsconfig.json` (1 hunks)
✅ Files skipped from review due to trivial changes (2)
- packages/server/README.md
- packages/server/examples/tsconfig.json
🧰 Additional context used
🪛 LanguageTool
packages/server/examples/hitl/README.md
[style] ~39-~39: Consider a more expressive alternative.
Context: ...before proceeding to the next step. To do this, you will need to implement two cu...
(DO_ACHIEVE)
[uncategorized] ~44-~44: A punctuation mark might be missing here.
Context: ... implemented these two custom events in events.ts: - `cliH...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
[uncategorized] ~84-~84: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...
(COMMA_COMPOUND_SENTENCE_2)
[uncategorized] ~141-~141: A punctuation mark might be missing here.
Context: ...ls The CLI executor tool is defined in tools.ts: ```types...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
🪛 markdownlint-cli2 (0.17.2)
packages/server/examples/hitl/README.md
27-27: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
Actionable comments posted: 2
♻️ Duplicate comments (2)
packages/server/src/utils/hitl/events.ts (2)
10-14: Discriminator should be a fixed literal, not `string`
`HumanInputEventData.type` is still declared as `string`, making it impossible to discriminate reliably against other event kinds. Please narrow it to a literal value (e.g. `"human_input"`) to mirror `HumanResponseEventData`.
26-34: `message.annotations` is not on `ai.Message`; add augmentation or local type
The upstream `Message` interface exposes only `id`, `content`, and `role`. Accessing `annotations` will either produce a TS error or devolve to `any`. Provide a module augmentation or introduce a local `ExtendedMessage` type that includes `annotations?: unknown[]`.
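A minimal sketch of the local-type option (the base `Message` shape below is assumed from the review's description, not copied from the `ai` package):

```typescript
// Assumed upstream shape (per the review: only id, content, role).
interface Message {
  id: string;
  content: string;
  role: "user" | "assistant" | "system";
}

// Local extension adding the optional annotations array.
interface ExtendedMessage extends Message {
  annotations?: unknown[];
}

// Reading annotations now type-checks without casting to `any`.
function lastAnnotation(message: ExtendedMessage): unknown {
  return message.annotations?.[message.annotations.length - 1];
}
```

A module augmentation of the upstream package achieves the same result globally, but the local type keeps the change scoped to the files that actually read `annotations`.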
🧹 Nitpick comments (1)
packages/server/src/utils/hitl/events.ts (1)
27-27: Prefer `z.unknown()` over `z.any()` for stronger type safety

```diff
-const schema = z.object({ type: z.literal("human_response"), data: z.any() });
+const schema = z.object({ type: z.literal("human_response"), data: z.unknown() });
```

Using `unknown` forces downstream consumers to perform explicit validation instead of silently accepting anything.
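The same principle holds in plain TypeScript, independent of zod: `any` silently permits every operation, while `unknown` makes the compiler demand a narrowing check first (function and field names below are illustrative):

```typescript
// With `unknown`, direct property access is rejected by the compiler;
// the caller must narrow the value before using it.
function describePayload(data: unknown): string {
  if (typeof data === "string") {
    return `string of length ${data.length}`; // narrowed: .length is safe here
  }
  if (typeof data === "object" && data !== null && "command" in data) {
    return "object with a command field";
  }
  return "unrecognized payload";
}
```

Had the parameter been `any`, a typo like `data.lenght` would compile and only fail at runtime.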
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (4)
- `packages/create-llama/templates/types/llamaindexserver/nextjs/package.json` (1 hunks)
- `packages/server/package.json` (2 hunks)
- `packages/server/project-config/package.json` (1 hunks)
- `packages/server/src/utils/hitl/events.ts` (1 hunks)
✅ Files skipped from review due to trivial changes (2)
- packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
- packages/server/project-config/package.json
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/server/package.json
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/server/src/utils/hitl/events.ts (1)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
`Message` (6-10)
Actionable comments posted: 7
🧹 Nitpick comments (15)
packages/create-llama/helpers/typescript.ts (1)
86-92: Consider pre-computing a Set for O(1) look-ups
`NO_DATA_USE_CASES.includes(useCase)` is fine for a 3-element array, but the list is growing. Turning the array into a `Set` in `constant.ts` would avoid repeated linear searches and convey intent that values are unique.

```diff
-export const NO_DATA_USE_CASES: TemplateUseCase[] = [
-  "code_generator",
-  "document_generator",
-  "hitl",
-];
+export const NO_DATA_USE_CASES = new Set<TemplateUseCase>([
+  "code_generator",
+  "document_generator",
+  "hitl",
+]);
```

and here:

```diff
-if (useCase && NO_DATA_USE_CASES.includes(useCase)) {
+if (useCase && NO_DATA_USE_CASES.has(useCase)) {
```
packages/create-llama/helpers/index.ts (1)
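Returning to the `Set` suggestion above, a quick sketch of the trade-off (the constant name follows the review; the union members are illustrative):

```typescript
type TemplateUseCase = "code_generator" | "document_generator" | "hitl" | "agentic_rag";

// Array form: `includes` is an O(n) scan on every call.
const NO_DATA_LIST: TemplateUseCase[] = ["code_generator", "document_generator", "hitl"];

// Set form: `has` is an O(1) hash lookup, and duplicates are
// impossible by construction.
const NO_DATA_USE_CASES = new Set<TemplateUseCase>(NO_DATA_LIST);

function shouldSkipDataGeneration(useCase: TemplateUseCase): boolean {
  return NO_DATA_USE_CASES.has(useCase);
}
```

For three elements the difference is negligible at runtime; the stronger argument is the uniqueness guarantee and the intent the `Set` type conveys.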
103-104: Minor readability nit – collapse boolean
`const shouldRunGenerate = !useCase || !NO_DATA_USE_CASES.includes(useCase);` is correct but a little dense.
Consider inlining the check or extracting a helper for clarity, e.g.:

```ts
const isDatafulUseCase = !useCase || !NO_DATA_USE_CASES.includes(useCase);
if (isDatafulUseCase) { ... }
```
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (1)
24-30: Avoid manual list maintenance
`useCases` now repeats the same literals held in `NO_DATA_USE_CASES`. To keep tests in sync automatically:

```diff
-const useCases: TemplateUseCase[] = [
-  "agentic_rag",
-  ...
-  "document_generator",
-  "hitl",
-];
+const useCases: TemplateUseCase[] = [
+  "agentic_rag",
+  "deep_research",
+  "financial_report",
+  ...NO_DATA_USE_CASES,
+];
```

This eliminates the risk of forgetting to add future no-data use cases in two places.
packages/create-llama/questions/simple.ts (3)
15-17: Growing union hints at maintainability pressure
The manual `AppType` union keeps expanding. You might consider deriving it from the keys of `lookup` or from `NO_DATA_USE_CASES` to avoid divergent definitions.
62-67: Choice list duplication
The choices array duplicates the literals in `AppType` and `NO_DATA_USE_CASES`. Same maintainability concern as above—centralise to one source where feasible.
182-187: Lookup entry added – consider DRYing repeated blocks
`code_generator`, `document_generator`, and `hitl` share identical entries. Consolidating via a helper or spreading a base object can cut noise.

```ts
const BASE_NO_DATA: Pick<...> = {
  template: "llamaindexserver",
  dataSources: [],
  tools: [],
  modelConfig: MODEL_GPT41,
};

code_generator: BASE_NO_DATA,
document_generator: BASE_NO_DATA,
hitl: BASE_NO_DATA,
```
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts (1)
5-8: Name clash risk
`name: "cli_executor"` is exported from a template package as a global tool name. If multiple projects load these templates side by side, name collisions may occur. Consider prefixing with the package or workflow id (`hitl_cli_executor`).
packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx (1)
75-80: Disabled input has `onChange` – dead code
Because the `<input>` is `disabled`, `onChange` never fires and `editableCommand` can't actually be edited.
• If edits should be allowed, remove `disabled`.
• If not, drop the `onChange` handler and local state altogether.
packages/create-llama/templates/components/use-cases/python/hitl/events.py (1)
8-13: Expose stderr to the workflow
`CLIHumanResponseEvent` captures only `execute` and `command`. Consider adding optional `stderr`/`exit_code` so the workflow can detect failures.
🪛 Pylint (3.3.7)
[refactor] 8-8: Too few public methods (0/2)
(R0903)
packages/create-llama/templates/components/use-cases/python/hitl/README-template.md (1)
26-29: Minor markdown/style fixes
Add a language tag to the fenced block and fix the article typo:

````diff
-```
+```bash
 ...
-You can start by sending an request
+You can start by sending a request
````
🪛 markdownlint-cli2 (0.17.2)
26-26: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)
packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md (1)
27-31: Incorrect file reference & article typo
`workflow.py` is a Python file – the TS template probably wants `workflow.ts`. Also fix "an request".

```diff
-To update the workflow, you can modify the code in [`workflow.py`](app/workflow.py).
+To update the workflow, you can modify the code in [`workflow.ts`](src/app/workflow.ts).
-You can start by sending an request
+You can start by sending a request
```

🧰 Tools
🪛 LanguageTool
[uncategorized] ~27-~27: A punctuation mark might be missing here.
Context: ...he workflow, you can modify the code in workflow.py. You can...

(AI_EN_LECTOR_MISSING_PUNCTUATION)
[misspelling] ~29-~29: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...workflow.py). You can start by sending an request on the [chat UI](http://localho...

(EN_A_VS_AN)
🪛 markdownlint-cli2 (0.17.2)
31-31: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts (2)
58-66: Rename to avoid double-`data` confusion

Inside the handler, `data.data` is easy to misread. Destructure once and rename:

```diff
-const { command, execute } = data.data;
+const { command, execute } = data; // `data` is already the payload
```

This keeps the API self-explanatory.
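To illustrate the double-`data` readability point, a tiny sketch with a hypothetical event shape (the names here are illustrative, not the template's actual types):

```typescript
// Hypothetical wrapper: workflow events often carry their payload under `data`.
type CliResponseEvent = { data: { command: string; execute: boolean } };

const responseEvent: CliResponseEvent = {
  data: { command: "ls -la", execute: true },
};

// Destructuring once at the handler boundary keeps each name unambiguous,
// instead of reaching through `responseEvent.data.data`-style chains.
const { data } = responseEvent;
const { command, execute } = data;

console.log(command, execute); // ls -la true
```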
82-88: Avoid mutating user-supplied array

`chatHistory.push(...)` mutates the original `messages` array captured from the request.
Clone first to keep function-local state pure:

```diff
-const chatHistory = messages;
+const chatHistory = [...messages];
```

packages/create-llama/templates/components/use-cases/python/hitl/workflow.py (2)
51-55: Compact OS check

Minor readability win:

```python
cli_language = "bash" if platform.system() in ("Linux", "Darwin") else "cmd"
```

🧰 Tools
🪛 Ruff (0.11.9)
52-55: Use ternary operator `cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd"` instead of if-else-block

Replace if-else-block with `cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd"`

(SIM108)
🪛 Pylint (3.3.7)
[refactor] 52-52: Consider merging these comparisons with 'in' by using 'os_name in ('Linux', 'Darwin')'. Use a set instead if elements are hashable.
(R1714)
80-87: Drop unreachable `else` block

The `return` inside the `if ev.execute` branch already exits the function.
Removing the extra `else` reduces indentation noise.

```diff
-if ev.execute:
-    ...
-    return StopEvent(result=res.stdout or res.stderr)
-else:
-    return StopEvent(result=None)
+if not ev.execute:
+    return StopEvent(result=None)
+...
+return StopEvent(result=res.stdout or res.stderr)
```

🧰 Tools
🪛 Pylint (3.3.7)
[refactor] 80-87: Unnecessary "else" after "return", remove the "else" and de-indent the code inside it
(R1705)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (17)
- .changeset/eight-moons-perform.md (1 hunks)
- packages/create-llama/e2e/python/resolve_dependencies.spec.ts (1 hunks)
- packages/create-llama/e2e/shared/llamaindexserver_template.spec.ts (1 hunks)
- packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (3 hunks)
- packages/create-llama/helpers/constant.ts (1 hunks)
- packages/create-llama/helpers/index.ts (2 hunks)
- packages/create-llama/helpers/types.ts (1 hunks)
- packages/create-llama/helpers/typescript.ts (2 hunks)
- packages/create-llama/questions/simple.ts (5 hunks)
- packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx (1 hunks)
- packages/create-llama/templates/components/use-cases/python/hitl/README-template.md (1 hunks)
- packages/create-llama/templates/components/use-cases/python/hitl/events.py (1 hunks)
- packages/create-llama/templates/components/use-cases/python/hitl/workflow.py (1 hunks)
- packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md (1 hunks)
- packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts (1 hunks)
- packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts (1 hunks)
- packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts (1 hunks)
✅ Files skipped from review due to trivial changes (5)
- packages/create-llama/helpers/constant.ts
- packages/create-llama/e2e/shared/llamaindexserver_template.spec.ts
- packages/create-llama/helpers/types.ts
- packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts
- packages/create-llama/e2e/python/resolve_dependencies.spec.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- .changeset/eight-moons-perform.md
🧰 Additional context used
🧬 Code Graph Analysis (6)
packages/create-llama/helpers/index.ts (1)
packages/create-llama/helpers/constant.ts (1)
`NO_DATA_USE_CASES` (11-15)
packages/create-llama/helpers/typescript.ts (1)
packages/create-llama/helpers/constant.ts (1)
`NO_DATA_USE_CASES` (11-15)
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (1)
packages/create-llama/helpers/constant.ts (1)
`NO_DATA_USE_CASES` (11-15)
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts (4)
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts (1)
`cliExecutor` (5-20)

packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts (2)

`cliHumanInputEvent` (3-7), `cliHumanResponseEvent` (9-12)

packages/server/src/utils/events.ts (1)

`toAgentRunEvent` (72-93)

packages/server/src/utils/stream.ts (1)

`writeResponseToStream` (101-120)
packages/create-llama/templates/components/use-cases/python/hitl/workflow.py (1)
packages/create-llama/templates/components/use-cases/python/hitl/events.py (3)
`CLICommand` (15-16), `CLIHumanInputEvent` (20-34), `CLIHumanResponseEvent` (8-12)
packages/create-llama/questions/simple.ts (1)
packages/create-llama/helpers/constant.ts (1)
`NO_DATA_USE_CASES` (11-15)
🪛 Pylint (3.3.7)
packages/create-llama/templates/components/use-cases/python/hitl/events.py
[refactor] 8-8: Too few public methods (0/2)
(R0903)
[refactor] 15-15: Too few public methods (0/2)
(R0903)
[refactor] 20-20: Too few public methods (0/2)
(R0903)
packages/create-llama/templates/components/use-cases/python/hitl/workflow.py
[refactor] 52-52: Consider merging these comparisons with 'in' by using 'os_name in ('Linux', 'Darwin')'. Use a set instead if elements are hashable.
(R1714)
[refactor] 80-87: Unnecessary "else" after "return", remove the "else" and de-indent the code inside it
(R1705)
🪛 LanguageTool
packages/create-llama/templates/components/use-cases/python/hitl/README-template.md
[uncategorized] ~38-~38: A punctuation mark might be missing here.
Context: ...he workflow, you can modify the code in workflow.py. You can...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
[misspelling] ~40-~40: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...workflow.py). You can start by sending an request on the [chat UI](http://localho...
(EN_A_VS_AN)
[style] ~54-~54: Consider a more expressive alternative.
Context: ...before proceeding to the next step. To do this, you will need to implement two cu...
(DO_ACHIEVE)
[uncategorized] ~59-~59: A punctuation mark might be missing here.
Context: ... implemented these two custom events in events.ts: - `cliH...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
[uncategorized] ~99-~99: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...
(COMMA_COMPOUND_SENTENCE_2)
packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md
[uncategorized] ~27-~27: A punctuation mark might be missing here.
Context: ...he workflow, you can modify the code in workflow.py. You can...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
[misspelling] ~29-~29: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...workflow.py). You can start by sending an request on the [chat UI](http://localho...
(EN_A_VS_AN)
[style] ~43-~43: Consider a more expressive alternative.
Context: ...before proceeding to the next step. To do this, you will need to implement two cu...
(DO_ACHIEVE)
[uncategorized] ~48-~48: A punctuation mark might be missing here.
Context: ... implemented these two custom events in events.ts: - `cliH...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
[uncategorized] ~88-~88: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...
(COMMA_COMPOUND_SENTENCE_2)
🪛 markdownlint-cli2 (0.17.2)
packages/create-llama/templates/components/use-cases/python/hitl/README-template.md
26-26: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
42-42: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md
7-7: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
13-13: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
31-31: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
🪛 Ruff (0.11.9)
packages/create-llama/templates/components/use-cases/python/hitl/workflow.py
52-55: Use ternary operator cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd" instead of if-else-block
Replace if-else-block with cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd"
(SIM108)
⏰ Context from checks skipped due to timeout of 90000ms (55)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: Unit Tests (windows-latest, 3.9)
🔇 Additional comments (7)
packages/create-llama/helpers/typescript.ts (1)
7-7: Good move – centralised constant import

Replacing hard-coded literals with the shared `NO_DATA_USE_CASES` constant increases maintainability and removes duplication.

packages/create-llama/helpers/index.ts (1)
7-7: 👍 Shared constant keeps logic DRY

Importing `NO_DATA_USE_CASES` here removes magic strings and aligns behaviour with the other helpers.

packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2)
6-6: Consistent source of truth for exclusions – nice

Pulling `NO_DATA_USE_CASES` into the test reduces drift between prod code and tests.

88-89: LGTM – dynamic skip uses shared constant

The new condition correctly prevents LlamaParse runs for hitl / other no-data use cases.

packages/create-llama/questions/simple.ts (2)
2-2: 👍 Centralised constant import

Keeps the questionnaire logic aligned with helper rules.

92-94: Logic is correct – name conveys intent

`shouldAskLlamaCloud` cleanly expresses the new rule based on the shared constant.

packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx (1)
30-34: `.at(-1)` requires Node 18 / modern browsers

`.at()` is Stage-4 but still absent in some LTS environments. If broader compatibility is a goal, replace with `arr[arr.length - 1]`.
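As a hedged sketch of that fallback, a small helper (the name `last` is ours, not from the template) keeps call sites readable while avoiding `.at()` entirely:

```typescript
// Portable equivalent of arr.at(-1) for runtimes without Array.prototype.at.
function last<T>(arr: readonly T[]): T | undefined {
  return arr.length > 0 ? arr[arr.length - 1] : undefined;
}

const annotations = ["start", "human_input", "resume"];
console.log(last(annotations)); // resume
console.log(last<string>([])); // undefined
```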
Summary by CodeRabbit
New Features
Enhancements
Documentation