
Conversation

@thucpn
Collaborator

@thucpn thucpn commented Jun 9, 2025

Pattern:

import { createWorkflow, workflowEvent } from "@llama-flow/core";
import { request, withSnapshot } from "@llama-flow/core/middleware/snapshot";
import fs from "fs";

///////////// EVENTS /////////////
const startEvent = workflowEvent<string>({
	debugLabel: "Start event",
});
const humanInputEvent = workflowEvent<string>({
	debugLabel: "Human input event",
});
const humanResponseEvent = workflowEvent<string>({
	debugLabel: "Human response event",
});
const stopEvent = workflowEvent<string>({
	debugLabel: "Stop event",
});

///////////// CLIENT CODE: WORKFLOW DEFINITION /////////////
const workflowFactory = () => {
	const workflow = withSnapshot(createWorkflow());

	workflow.handle([startEvent], ({ data: startInput }) => {
		console.log("Start event---");

	const shouldRequestHuman = startInput.length < 3; // condition for requesting human input

	if (shouldRequestHuman) {
		// request human input with a reason
		return humanInputEvent.with("input length < 3");
	}

	// no need to request human input, so stop the workflow
	return stopEvent.with("Stop without requesting human input");
	});

	workflow.handle([humanResponseEvent], ({ data: userInput }) => {
		// userInput is the response from the human
		// we can call the LLM with the user input to get the final result
		const sampleFinalResult = "Sample final result for human input: " + userInput;
		return stopEvent.with(sampleFinalResult);
	});

	return workflow;
};

///////////// SERVER CODE: RUN & RESUME WORKFLOW /////////////
const runWorkflow = async () => {
	const workflow = workflowFactory();
	const { sendEvent, stream, snapshot, onRequest } = workflow.createContext();

	// try sending a start event with empty input to trigger a human input request
	sendEvent(startEvent.with(""));

	// consume stream
	for await (const event of stream) {
		if (humanInputEvent.include(event)) {
			console.log("Need to request human input with reason: ", event.data);

			// send a request event
			sendEvent(request(humanResponseEvent));

			// save snapshot
			const [_, snapshotData] = await snapshot();
			console.log("Snapshot data: ", snapshotData);
			fs.writeFileSync("snapshot.json", JSON.stringify(snapshotData, null, 2));

			// stop the workflow
			break;
		}

		console.log(event.data);
	}
};

const resumeWorkflow = async () => {
	const workflow = workflowFactory();
	const snapshotData = JSON.parse(fs.readFileSync("snapshot.json", "utf8"));
	const { stream } = workflow.resume(["test input"], snapshotData);

	for await (const event of stream) {
		console.log(event.data);
	}
};

async function main() {
	await runWorkflow();
	await new Promise((resolve) => setTimeout(resolve, 2000));
	await resumeWorkflow();
}

main();

Summary by CodeRabbit

  • New Features

    • Introduced human-in-the-loop (HITL) support, enabling workflows to pause for and resume from human input.
    • Added the ability to save and restore workflow snapshots, allowing for workflow continuation after user responses.
    • Provided a CLI command execution tool and example workflow demonstrating HITL interactions.
    • New UI component for confirming CLI commands during HITL workflows.
    • Example server and documentation for running HITL workflows.
  • Enhancements

    • Improved chat and workflow handling to support pausing, resuming, and streaming with human input events.
    • Expanded utility exports for easier integration of HITL and streaming functionalities.
  • Documentation

    • Added README for HITL workflow example setup and usage instructions.

@thucpn thucpn requested a review from marcusschiesser June 9, 2025 12:36
@changeset-bot

changeset-bot bot commented Jun 9, 2025

🦋 Changeset detected

Latest commit: 5b06106

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 4 packages
| Name | Type |
| --- | --- |
| @llamaindex/server | Patch |
| create-llama | Patch |
| llamaindex-server-examples | Patch |
| @create-llama/llama-index-server | Patch |


@coderabbitai

coderabbitai bot commented Jun 9, 2025

Walkthrough

This update introduces human-in-the-loop (HITL) workflow capabilities to the server package. It adds event schemas, snapshot/resume logic, and stream handling for human input, alongside a CLI-based workflow example. Core workflow and streaming utilities are refactored to support HITL events, and new exports are provided for HITL-related modules and helpers.

Changes

| File(s) | Change Summary |
| --- | --- |
| packages/server/src/handlers/chat.ts, packages/server/next/app/api/chat/route.ts | Refactored chat handling and API route to integrate HITL workflow context, snapshot restoration, and human input stream handling. |
| packages/server/src/utils/workflow.ts | Refactored runWorkflow to accept HITL context, return workflow context, and support resuming from human responses. Exported processWorkflowStream. |
| packages/server/src/utils/stream.ts | Added HITL event handling to data stream, new onPauseForHumanInput callback, and utility to write response to stream. |
| packages/server/src/utils/hitl/events.ts | Added schemas, types, and helpers for HITL events and response extraction. |
| packages/server/src/utils/hitl/pause.ts, resume.ts, snapshot.ts | Added utilities for pausing workflows, resuming from human responses, and managing workflow snapshots. |
| packages/server/src/utils/hitl/index.ts, packages/server/src/utils/index.ts | Added and updated re-exports for consolidated HITL utilities. |
| packages/server/src/index.ts | Re-exported HITL and stream utilities from main entry point. |
| packages/server/examples/hitl/README.md | Added instructions for running the HITL workflow example. |
| packages/server/examples/hitl/components/cli_human_input.tsx | Added React component for CLI human input confirmation in HITL workflow. |
| packages/server/examples/hitl/index.ts | Added example server setup for HITL workflow. |
| packages/server/examples/hitl/src/app/tools.ts | Added CLI executor tool for running shell commands. |
| packages/server/examples/hitl/src/app/workflow.ts | Added HITL workflow integrating LLM, CLI tool, and human approval step. |
| packages/server/examples/hitl/src/app/events.ts | Added CLI-specific HITL event schemas for input and response. |
| .changeset/eight-moons-perform.md | Added changeset for HITL support in server package. |
| packages/server/package.json | Updated peer dependency version of @llamaindex/workflow. |
| packages/server/examples/tsconfig.json | Excluded hitl/components directory from TypeScript compilation. |
| packages/create-llama/templates/types/llamaindexserver/nextjs/package.json, packages/server/project-config/package.json, packages/server/examples/package.json | Updated dependency versions for @llamaindex/workflow and tsx. |
| packages/create-llama/e2e/python/resolve_dependencies.spec.ts, packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts | Added "hitl" to test use cases for dependency resolution. |
| packages/create-llama/e2e/shared/llamaindexserver_template.spec.ts | Disabled "hitl" test case in shared template tests. |
| packages/create-llama/helpers/constant.ts | Added NO_DATA_USE_CASES array including "hitl". |
| packages/create-llama/helpers/index.ts | Updated generate script condition to exclude use cases in NO_DATA_USE_CASES. |
| packages/create-llama/helpers/types.ts | Added "hitl" to TemplateUseCase type union. |
| packages/create-llama/helpers/typescript.ts | Updated file removal condition to use NO_DATA_USE_CASES for "hitl". |
| packages/create-llama/questions/simple.ts | Added "hitl" app type option and lookup entry for CLI command workflow with human review. |
| packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx | Added React component for CLI command confirmation UI. |
| packages/create-llama/templates/components/use-cases/python/hitl/events.py | Added Pydantic models for CLI human input and response events. |
| packages/create-llama/templates/components/use-cases/python/hitl/workflow.py | Added Python HITL CLI workflow with command generation, human approval, and execution. |
| packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md | Added README template describing TypeScript HITL CLI workflow setup and usage. |
| packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts | Added typed TypeScript HITL CLI human input and response events. |
| packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts | Added TypeScript CLI executor tool for synchronous command execution. |
| packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts | Added TypeScript HITL CLI workflow integrating LLM, tool calls, and human approval events. |

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant API
    participant Workflow
    participant HITL Utils
    participant Human (UI)
    participant Snapshot Store

    Client->>API: Send chat request (may include id/human responses)
    API->>Workflow: runWorkflow({ workflow, input, human })
    alt Resume from snapshot
        Workflow->>HITL Utils: resumeWorkflowFromHumanResponses
        HITL Utils->>Snapshot Store: loadSnapshot(requestId)
        HITL Utils-->>Workflow: Resume context with human responses
    else New workflow
        Workflow-->>API: Create new context, start event
    end
    Workflow-->>API: Return workflow context/stream
    API->>API: processWorkflowStream
    loop Stream events
        API->>Client: Stream assistant responses/events
        alt HITL event (pause for input)
            API->>HITL Utils: pauseForHumanInput(context, responseEvent, requestId)
            HITL Utils->>Snapshot Store: saveSnapshot(requestId, snapshot)
            API->>Human (UI): Request human input
            Human (UI)->>API: Send human response
            API->>Workflow: Resume workflow with human response
        end
    end

Suggested reviewers

  • marcusschiesser

Poem

🐇
A hop, a skip, a pause for thought—
Now workflows wait when humans ought.
Snapshots saved, commands approved,
HITL magic, smoothly moved.
With streams and tools, the code can grow,
And rabbits cheer, "Onward we go!"
🥕✨



@thucpn thucpn marked this pull request as draft June 9, 2025 12:36
@thucpn thucpn changed the title feat: support human in the loop for TS [WIP] feat: support human in the loop for TS Jun 9, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 7

🧹 Nitpick comments (3)
packages/server/src/utils/stream.ts (1)

29-30: Documentation improvement suggestion

The comment has extra spaces that should be cleaned up for consistency.

-  /** `onPauseForHumanInput`: Called when human input   event is emitted. */
+  /** `onPauseForHumanInput`: Called when human input event is emitted. */
packages/server/src/utils/hitl.ts (2)

1-1: Consider removing the ESLint disable comment.

The blanket disable of @typescript-eslint/no-explicit-any suggests heavy usage of any types. Consider using more specific types or generics where possible to maintain type safety.

-/* eslint-disable @typescript-eslint/no-explicit-any */

18-22: Address the TODO for file-based persistence.

The in-memory storage is not suitable for production use. Consider implementing file-based or database persistence for workflow snapshots.

Would you like me to generate an implementation for file-based snapshot persistence using the filesystem or a database solution?
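
For reference, a minimal sketch of what file-based persistence could look like, assuming the snapshot data is JSON-serializable. The directory, function names, and signatures here are illustrative, not the package's actual API:

```ts
import { promises as fs } from "node:fs";
import path from "node:path";

// Hypothetical location; a real implementation would make this configurable.
const SNAPSHOT_DIR = path.resolve("output", "snapshots");

export async function saveSnapshot(requestId: string, snapshot: unknown): Promise<void> {
  await fs.mkdir(SNAPSHOT_DIR, { recursive: true });
  const file = path.join(SNAPSHOT_DIR, `${requestId}.json`);
  await fs.writeFile(file, JSON.stringify(snapshot, null, 2), "utf8");
}

export async function loadSnapshot(requestId: string): Promise<unknown | null> {
  const file = path.join(SNAPSHOT_DIR, `${requestId}.json`);
  try {
    return JSON.parse(await fs.readFile(file, "utf8"));
  } catch {
    return null; // no snapshot persisted for this request yet
  }
}
```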

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 66b81e5 and e4c07f9.

📒 Files selected for processing (5)
  • packages/server/src/handlers/chat.ts (3 hunks)
  • packages/server/src/utils/hitl.ts (1 hunks)
  • packages/server/src/utils/index.ts (1 hunks)
  • packages/server/src/utils/stream.ts (3 hunks)
  • packages/server/src/utils/workflow.ts (2 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)
packages/server/src/utils/stream.ts (1)
packages/server/src/utils/hitl.ts (2)
  • HumanInputEventData (31-34)
  • humanInputEvent (39-39)
packages/server/src/handlers/chat.ts (3)
packages/server/src/utils/hitl.ts (3)
  • getHumanResponseFromMessage (58-68)
  • createWorkflowContextFromHumanResponse (70-96)
  • pauseForHumanInput (98-113)
packages/server/src/utils/workflow.ts (1)
  • runWorkflow (26-46)
packages/server/src/utils/stream.ts (1)
  • toDataStream (39-90)
packages/server/src/utils/hitl.ts (1)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  • Message (6-10)
🔇 Additional comments (6)
packages/server/src/utils/index.ts (1)

4-4: LGTM!

The export addition follows the established pattern and correctly exposes the HITL utilities.

packages/server/src/utils/stream.ts (1)

68-73: LGTM! Stream handling for human input events is well implemented.

The implementation correctly:

  • Invokes the callback when a human input event is detected
  • Writes the event data as a message annotation for persistence
  • Breaks the stream to pause processing, which is essential for HITL workflows
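
To illustrate the "break to pause" pattern in isolation, here is a self-contained toy; the event shape and callback name only mirror this PR and are not its actual types:

```ts
type StreamEvent =
  | { type: "delta"; text: string }
  | { type: "human_input"; data: unknown };

async function* demoStream(): AsyncGenerator<StreamEvent> {
  yield { type: "delta", text: "working..." };
  yield { type: "human_input", data: { command: "ls -l" } };
  yield { type: "delta", text: "never consumed: the loop breaks first" };
}

async function consume(onPauseForHumanInput: (data: unknown) => void) {
  for await (const event of demoStream()) {
    if (event.type === "human_input") {
      onPauseForHumanInput(event.data); // hand the event off to the UI layer
      break; // pause: remaining events stay unconsumed until resume
    }
    console.log(event.text);
  }
}

void consume((data) => console.log("need human input:", data));
```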
packages/server/src/handlers/chat.ts (1)

27-27:

❓ Verification inconclusive

Address TODO and add validation for requestId

The TODO comment indicates this implementation might be incomplete. Additionally, the requestId is extracted without validation, which could cause issues if it's missing or not a string.

-    const requestId = req.headers["x-request-id"] as string; // TODO: update for chat route also
+    const requestId = req.headers["x-request-id"];
+    if (!requestId || typeof requestId !== "string") {
+      return sendJSONResponse(res, 400, {
+        error: "Missing or invalid x-request-id header",
+      });
+    }

Can you clarify what needs to be updated for the chat route as mentioned in the TODO?


🏁 Script executed:

#!/bin/bash
set -euxo pipefail

# Locate and print the beginning of the chat handler
FILE=$(fd chat.ts)
echo "Found chat handler at: $FILE"
echo "---- Chat Handler (lines 1–200) ----"
sed -n '1,200p' "$FILE"

# Show occurrences of x-request-id in all handlers for comparison
echo
echo "---- x-request-id usages in handlers ----"
rg -n -C2 "x-request-id" packages/server/src/handlers

# Show patterns of requestId extraction across handlers
echo
echo "---- requestId extraction in handlers ----"
rg -n -C2 "const requestId" packages/server/src/handlers

Length of output: 2032


#!/bin/bash
set -euxo pipefail

# Locate the specific chat handler in the server package
echo "Searching for server chat handler file..."
find packages/server/src/handlers -type f -name chat.ts

# Display the first 200 lines of the handler to inspect the TODO and header extraction
FILE="packages/server/src/handlers/chat.ts"
echo "---- $FILE (lines 1–200) ----"
sed -n '1,200p' "$FILE"

# Show occurrences of x-request-id and requestId usage in this file
echo
echo "---- x-request-id usage in $FILE ----"
rg -n -C2 "x-request-id" "$FILE"

echo
echo "---- requestId extraction in $FILE ----"
rg -n -C2 "const requestId" "$FILE"

#!/bin/bash
set -euxo pipefail

# Verify chat handler exists
FILE="packages/server/src/handlers/chat.ts"
echo "Exists: $( [ -f "$FILE" ] && echo true || echo false )"

# Show header extraction and TODO around line 27
echo "---- $FILE (lines 1–60) ----"
sed -n '1,60p' "$FILE"

echo
echo "---- $FILE (lines 20–40) ----"
sed -n '20,40p' "$FILE"

# Show x-request-id usage in chat handler
echo
echo "---- x-request-id in chat.ts ----"
rg -n -C2 "x-request-id" "$FILE"

# Compare requestId extraction in other handlers
echo
echo "---- requestId in other handlers ----"
rg -n -C2 "const requestId" packages/server/src/handlers

#!/bin/bash
set -euxo pipefail

# Path to the chat handler in the server package
FILE="packages/server/src/handlers/chat.ts"
if [ ! -f "$FILE" ]; then
  echo "Error: $FILE not found" >&2
  exit 1
fi

# Show the start of the file including the TODO
echo "---- $FILE (lines 1–60) ----"
sed -n '1,60p' "$FILE"

# Narrow in on the TODO comment area around line 27
echo
echo "---- $FILE (lines 20–40) ----"
sed -n '20,40p' "$FILE"

# Show any x-request-id header references in this file
echo
echo "---- x-request-id usage in $FILE ----"
rg -n -C2 "x-request-id" "$FILE" || echo "(none found)"

# Compare how other handlers extract requestId
echo
echo "---- requestId extraction in all server handlers ----"
rg -n -C2 "const requestId" packages/server/src/handlers || echo "(none found)"

Unable to locate chat.ts TODO – please verify

I couldn’t find or inspect the // TODO: update for chat route also line in packages/server/src/handlers/chat.ts. Could you:

  • Confirm that the file path is correct
  • Share lines 20–40 (or surrounding the TODO) from that file

With that snippet I can verify the existing pattern in other handlers and clarify what “update for chat route” entails.

packages/server/src/utils/workflow.ts (1)

34-40: Approve the workflow execution refactoring.

The change from calling workflow.run() to using context.sendEvent() and context.stream is correct and aligns with the HITL workflow pattern where contexts can be resumed from snapshots.
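
A condensed sketch of that pattern, reusing the event helpers demonstrated at the top of this PR (the one-shot run call is paraphrased for contrast):

```ts
import { createWorkflow, workflowEvent } from "@llama-flow/core";

const startEvent = workflowEvent<string>({ debugLabel: "start" });
const stopEvent = workflowEvent<string>({ debugLabel: "stop" });

const workflow = createWorkflow();
workflow.handle([startEvent], ({ data }) => stopEvent.with(`done: ${data}`));

// Instead of a one-shot `workflow.run(...)`, create an explicit context
// whose stream can be paused (by breaking) and later resumed.
const context = workflow.createContext();
context.sendEvent(startEvent.with("hello"));

for await (const event of context.stream) {
  if (stopEvent.include(event)) {
    console.log(event.data);
    break;
  }
}
```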

packages/server/src/utils/hitl.ts (2)

12-16: Approve the type inference approach.

Using ReturnType to infer snapshot types from the withSnapshot function is a clever approach when the library doesn't export these types directly.
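
For readers skimming the diff, the trick looks roughly like this; the derived type names mirror snapshot.ts, so treat them as assumptions:

```ts
import { withSnapshot } from "@llama-flow/core/middleware/snapshot";

// The middleware does not export its wrapper types, so derive them:
type SnapshotWorkflow = ReturnType<typeof withSnapshot>;
type SnapshotWorkflowContext = ReturnType<SnapshotWorkflow["createContext"]>;
```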


31-39: Approve the event type definitions.

The HumanInputEventData structure and humanInputEvent definition are well-designed for extensibility.

@thucpn thucpn requested a review from leehuwuj June 10, 2025 10:51
@thucpn thucpn changed the title [WIP] feat: support human in the loop for TS feat: support human in the loop for TS Jun 11, 2025
@thucpn thucpn marked this pull request as ready for review June 11, 2025 03:35

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

♻️ Duplicate comments (1)
packages/server/src/utils/workflow.ts (1)

30-41: Fix incorrect runWorkflow calls to use the new signature.

The function signature has been changed to accept an object parameter, but some call sites may still be using the old signature.

🧹 Nitpick comments (4)
packages/server/examples/hitl/README.md (1)

1-1: Improve phrasing & article usage in README headline

The opening sentence is grammatically off and slightly vague. A concise alternative would be:

-This example demonstrates human in the loop workflow.
+This example demonstrates **a human-in-the-loop workflow**.

This small tweak clarifies intent and reads more naturally.

🧰 Tools
🪛 LanguageTool

[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: ...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)

packages/server/src/utils/hitl/pause.ts (1)

15-19: Consider returning the generated requestId

pauseForHumanInput generates or receives a requestId, persists a snapshot, but discards the ID. Returning it lets callers persist or surface the identifier to clients without guessing.

-  await saveSnapshot(requestId, snapshotData);
+  await saveSnapshot(requestId, snapshotData);
+
+  // Expose the identifier so the caller / client can resume later
+  return requestId;

This is a backwards-compatible, single-line change that improves ergonomics.

packages/server/examples/hitl/components/cli_human_input.tsx (1)

75-80: disabled input cannot be edited – remove onChange / state

The field is rendered with disabled, so onChange will never fire. editableCommand state is redundant unless you intend to make the field editable later.

-        <input
-          disabled
-          type="text"
-          value={editableCommand || ""}
-          onChange={(e) => setEditableCommand(e.target.value)}
+        <input
+          disabled
+          type="text"
+          value={inputEvent.command}
           className="my-2 w-full overflow-x-auto rounded border border-gray-300 bg-gray-100 p-3 font-mono text-xs text-gray-800"
         />
packages/server/src/utils/hitl/events.ts (1)

5-8: Consider using more specific types instead of z.any().

Using z.any() for the data field removes type safety. Consider using a discriminated union or generic types for better type safety.

Example approach using discriminated unions:

-export const humanInputEventSchema = z.object({
-  type: z.string(), // An identifier for the input component in UI
-  data: z.any(), // The data to be sent to the input component in UI
-});
+export const humanInputEventSchema = z.discriminatedUnion("type", [
+  z.object({
+    type: z.literal("cli_human_input"),
+    data: z.object({
+      command: z.string(),
+    }),
+  }),
+  // Add other input types as needed
+  z.object({
+    type: z.string(),
+    data: z.unknown(), // Fallback for unknown types
+  }),
+]);
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e4c07f9 and d93ee94.

📒 Files selected for processing (17)
  • .changeset/eight-moons-perform.md (1 hunks)
  • packages/server/examples/hitl/README.md (1 hunks)
  • packages/server/examples/hitl/components/cli_human_input.tsx (1 hunks)
  • packages/server/examples/hitl/index.ts (1 hunks)
  • packages/server/examples/hitl/src/app/tools.ts (1 hunks)
  • packages/server/examples/hitl/src/app/workflow.ts (1 hunks)
  • packages/server/next/app/api/chat/route.ts (3 hunks)
  • packages/server/package.json (1 hunks)
  • packages/server/src/handlers/chat.ts (1 hunks)
  • packages/server/src/index.ts (1 hunks)
  • packages/server/src/utils/hitl/events.ts (1 hunks)
  • packages/server/src/utils/hitl/index.ts (1 hunks)
  • packages/server/src/utils/hitl/pause.ts (1 hunks)
  • packages/server/src/utils/hitl/resume.ts (1 hunks)
  • packages/server/src/utils/hitl/snapshot.ts (1 hunks)
  • packages/server/src/utils/stream.ts (4 hunks)
  • packages/server/src/utils/workflow.ts (2 hunks)
✅ Files skipped from review due to trivial changes (5)
  • .changeset/eight-moons-perform.md
  • packages/server/examples/hitl/index.ts
  • packages/server/package.json
  • packages/server/src/utils/hitl/index.ts
  • packages/server/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • packages/server/src/utils/stream.ts
  • packages/server/src/handlers/chat.ts
🧰 Additional context used
🧬 Code Graph Analysis (6)
packages/server/src/utils/hitl/pause.ts (2)
packages/server/src/utils/hitl/snapshot.ts (2)
  • ensureSnapshotWorkflowContext (69-78)
  • saveSnapshot (29-42)
packages/server/src/utils/hitl/events.ts (1)
  • humanResponseEvent (26-26)
packages/server/src/utils/hitl/resume.ts (2)
packages/server/src/utils/hitl/events.ts (1)
  • HumanResponseEventData (22-22)
packages/server/src/utils/hitl/snapshot.ts (3)
  • SnapshotWorkflowContext (11-13)
  • ensureSnapshotWorkflow (60-67)
  • loadSnapshot (44-58)
packages/server/next/app/api/chat/route.ts (6)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  • Message (6-10)
packages/server/src/utils/workflow.ts (2)
  • runWorkflow (30-63)
  • processWorkflowStream (65-111)
packages/server/examples/hitl/src/app/workflow.ts (1)
  • workflowFactory (35-114)
packages/server/src/utils/hitl/events.ts (1)
  • getHumanResponsesFromMessage (29-38)
packages/server/src/utils/stream.ts (1)
  • toDataStream (42-94)
packages/server/src/utils/hitl/pause.ts (1)
  • pauseForHumanInput (7-20)
packages/server/examples/hitl/src/app/workflow.ts (4)
packages/server/src/utils/hitl/events.ts (3)
  • humanInputEventSchema (5-8)
  • humanInputEvent (15-15)
  • humanResponseEvent (26-26)
packages/server/examples/hitl/src/app/tools.ts (1)
  • cliExecutor (5-20)
packages/server/src/utils/events.ts (1)
  • toAgentRunEvent (72-93)
packages/server/src/utils/stream.ts (1)
  • writeResponseToStream (96-115)
packages/server/src/utils/workflow.ts (2)
packages/server/src/utils/hitl/events.ts (1)
  • HumanResponseEventData (22-22)
packages/server/src/utils/hitl/resume.ts (1)
  • resumeWorkflowFromHumanResponses (10-28)
packages/server/src/utils/hitl/events.ts (1)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  • Message (6-10)
🪛 LanguageTool
packages/server/examples/hitl/README.md

[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: ...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)

🔇 Additional comments (8)
packages/server/src/utils/hitl/pause.ts (1)

15-16: Verify request(humanResponseEvent) call signature

If request expects (eventDef, data?), passing only the event definition will raise at runtime.
Please double-check the expected overload; add an explicit undefined or payload if needed.

packages/server/src/utils/hitl/resume.ts (1)

24-26: Potential missing await for resume

If snapshotWorkflow.resume() is asynchronous (common in IO-heavy workflows), omitting await will return a pending Promise instead of a ready context.

-  const context = snapshotWorkflow.resume(humanResponses, snapshot);
+  const context = await snapshotWorkflow.resume(humanResponses, snapshot);

Please confirm the method’s return type and adjust accordingly.

packages/server/examples/hitl/components/cli_human_input.tsx (1)

35-50: editableCommand may be undefined – ensure non-null data in payload

handleConfirm sends { command: editableCommand }, which may be undefined if editing is never enabled. Use the original command to avoid downstream parsing issues.

-            command: editableCommand,
+            command: inputEvent.command,
packages/server/src/utils/workflow.ts (1)

65-65: LGTM! Export addition enables proper stream processing.

Making processWorkflowStream explicitly exported is appropriate for the HITL implementation.

packages/server/next/app/api/chat/route.ts (2)

51-58: LGTM! Proper integration of HITL functionality.

The updated runWorkflow call correctly uses the new signature and properly handles human responses and snapshot restoration.


67-67: 🛠️ Refactor suggestion

Consider making the snapshot ID generation more robust.

Currently using requestId directly for snapshot identification. If requestId is undefined, the snapshot will be saved with "undefined" as the filename.

Consider providing a fallback:

- onPauseForHumanInput: () => pauseForHumanInput(context, requestId),
+ onPauseForHumanInput: () => pauseForHumanInput(context, requestId || undefined),

The pauseForHumanInput function already generates a UUID if no ID is provided, so passing undefined explicitly would be cleaner than passing the string "undefined".

Likely an incorrect or invalid review comment.

packages/server/src/utils/hitl/events.ts (1)

29-38:

❓ Verification inconclusive

Verify that the Message type includes an annotations field.

The function accesses message.annotations but the Message interface shown in relevant snippets doesn't include this field.

Let me verify the Message type definition:


🏁 Script executed:

#!/bin/bash
# Search for Message type definition with annotations field
ast-grep --pattern 'interface Message {
  $$$
}'

# Also search for any type extensions
rg -A 10 "interface.*Message|type.*Message.*=" --type ts

Length of output: 1543


🏁 Script executed:

#!/bin/bash
# Locate the events.ts file(s)
files=$(fd 'events.ts' -e ts)

echo "Found events.ts files:"
echo "$files"
echo

# For each file, print the imports and the function definition
for f in $files; do
  echo "===== $f ====="
  sed -n '1,200p' "$f"
  echo
done

Length of output: 12639


🏁 Script executed:

#!/usr/bin/env bash
# Search for any TypeScript module augmentations for "ai"
rg "declare module .*['\"]ai['\"]" -n .

Length of output: 45


Confirm that ai’s Message type declares annotations
Please verify that the Message interface you’re importing from the ai package includes an optional annotations field (e.g. annotations?: unknown[]). Without this property, the call to message.annotations?.filter(...) will not type‐check.

  • Inspect node_modules/ai/index.d.ts (or equivalent) for the Message declaration and confirm annotations is present and correctly typed.
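
Independent of how the ai package types Message, the extraction can be made safe at runtime. A hedged sketch, where the annotation shape is an assumption based on this PR's event helpers rather than the actual schema:

```ts
import { z } from "zod";

// Assumed annotation shape; adjust to the real humanResponseEventSchema.
const humanResponseAnnotation = z.object({
  type: z.literal("human_response"),
  data: z.unknown(),
});

// Typing `annotations` as unknown[] avoids depending on the ai package's
// Message declaration while still narrowing each entry safely.
export function getHumanResponses(message: { annotations?: unknown[] }) {
  return (message.annotations ?? []).flatMap((annotation) => {
    const parsed = humanResponseAnnotation.safeParse(annotation);
    return parsed.success ? [parsed.data] : [];
  });
}
```
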
packages/server/src/utils/hitl/snapshot.ts (1)

60-78: LGTM! Well-implemented type guards with clear error messages.

The type guard functions properly check for the required methods and provide helpful error messages when the workflow doesn't support snapshots.
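
As an illustration of the approach, here is a minimal assertion-style guard; the method names follow the snapshot middleware, but the shapes are assumptions:

```ts
// Structural type describing only what the guard needs to check.
type SnapshotWorkflow = {
  createContext: (...args: unknown[]) => unknown;
  resume: (...args: unknown[]) => unknown;
};

function ensureSnapshotWorkflow(workflow: unknown): asserts workflow is SnapshotWorkflow {
  const candidate = workflow as Partial<SnapshotWorkflow>;
  if (
    typeof candidate.createContext !== "function" ||
    typeof candidate.resume !== "function"
  ) {
    throw new Error(
      "Workflow does not support snapshots. Wrap it with withSnapshot() before running HITL flows.",
    );
  }
}
```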


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
packages/server/src/utils/hitl/events.ts (1)

11-17: 🛠️ Refactor suggestion

⚠️ Potential issue

type discriminator should be a literal – keeps schema checks trivial

HumanInputEventData.type is declared as string, whereas the corresponding response event uses the literal "human_response".
Leaving it wide-open forces every consumer to pattern-match on additional fields or trust ad-hoc strings, defeating the purpose of a discriminator and breaking the nice schema.safeParse pattern you use below.

-export type HumanInputEventData = {
-  type: string;
-  data?: any;
-  response: WorkflowEvent<HumanResponseEventData>;
-};
+export type HumanInputEventData = {
+  type: "human_input";
+  /** Optional indicator for concrete UI component  */
+  component?: string;
+  data?: unknown;
+  response: WorkflowEvent<HumanResponseEventData>;
+};

Update any callers that relied on arbitrary type values to use the new component (or another dedicated) property instead.

🧹 Nitpick comments (2)
packages/server/src/utils/hitl/events.ts (2)

40-43: Dead-code? BaseEventGet is exported but never referenced

BaseEventGet is defined yet unused in this module. If nothing imports it elsewhere, drop the type alias to avoid API noise.


44-68: Potential unbounded growth of derivedEvents set

workflowBaseEvent stores every derived event ever created in derivedEvents, which lives for the lifetime of the process. Long-running servers that generate events dynamically (e.g., per request) will leak memory.

Consider one of:

-const derivedEvents = new Set<WorkflowEvent<any>>();
+// WeakSet lets GC collect unused events, preventing leaks
+const derivedEvents = new WeakSet<WorkflowEvent<any>>();

or expose a dispose()/clear() API if event factories are short-lived.

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 7e4c68b and 6a5db05.

📒 Files selected for processing (7)
  • packages/server/examples/hitl/src/app/events.ts (1 hunks)
  • packages/server/examples/hitl/src/app/workflow.ts (1 hunks)
  • packages/server/next/app/api/chat/route.ts (3 hunks)
  • packages/server/src/handlers/chat.ts (1 hunks)
  • packages/server/src/utils/hitl/events.ts (1 hunks)
  • packages/server/src/utils/hitl/pause.ts (1 hunks)
  • packages/server/src/utils/stream.ts (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (6)
  • packages/server/examples/hitl/src/app/events.ts
  • packages/server/examples/hitl/src/app/workflow.ts
  • packages/server/src/utils/hitl/pause.ts
  • packages/server/next/app/api/chat/route.ts
  • packages/server/src/utils/stream.ts
  • packages/server/src/handlers/chat.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/server/src/utils/hitl/events.ts (1)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  • Message (6-10)


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (2)
packages/server/examples/hitl/README.md (2)

1-1: Refine wording for clarity and grammar.

The article “the” and hyphens improve readability and correctness:

- This example demonstrates human in the loop workflow. Export OpenAI API key and start the server in dev mode.
+ This example demonstrates the human-in-the-loop workflow. To run the example, export your OpenAI API key and start the server in development mode.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: ...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)


3-6: Add directory navigation step.

Users need to cd into this example before running commands. For instance:

```bash
+ cd packages/server/examples/hitl
  export OPENAI_API_KEY=<your-openai-api-key>
  npx nodemon --exec tsx index.ts --ignore output/*
```

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8f107f575fbe12d9e91e801858772450996ebdc1 and 2c062c924a9fed053c158a6cee02e14f0a8b3de2.

📒 Files selected for processing (1)
  • packages/server/examples/hitl/README.md (1 hunks)
🧰 Additional context used
🪛 LanguageTool
packages/server/examples/hitl/README.md

[uncategorized] ~1-~1: You might be missing the article “the” here.
Context: ...ates human in the loop workflow. Export OpenAI API key and start the server in dev mod...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)

<details>
<summary>⏰ Context from checks skipped due to timeout of 90000ms (58)</summary>

* GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
* GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
* GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
* GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
* GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
* GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
* GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
* GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
* GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
* GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
* GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)


@thucpn thucpn requested a review from marcusschiesser June 12, 2025 03:13

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (5)
packages/server/examples/hitl/README.md (5)

1-1: Clarify Human-in-the-Loop terminology
Consider hyphenating and introducing the acronym in the main title for consistency.

Example:

-# Human in the Loop
+# Human-in-the-Loop (HITL)

9-13: Add dependency installation step
Currently, the instructions skip installing dependencies and entering the example directory. Include steps to install packages and switch to the example folder.

Example:

 ## Getting Started
 
-### Environment Setup
+### Prerequisites & Environment Setup
+
+- Navigate to the example directory:
+```bash
+cd packages/server/examples/hitl
+```
+
-Export your OpenAI API key:
+Install dependencies:
+```bash
+npm install
+```
+
+Export your OpenAI API key:
 ```bash
 export OPENAI_API_KEY=<your-openai-api-key>
 ```

27-29: Specify code block language for URL
The fenced code block containing the URL lacks a language specifier. Add one to satisfy markdownlint.

-```
-http://localhost:3000
-```
+```text
+http://localhost:3000
+```
🧰 Tools
🪛 markdownlint-cli2 (0.17.2)

27-27: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


64-64: Remove unnecessary comma
The comma before "that" is superfluous.

-HITL also needs a custom UI component, that is shown when the LlamaIndexServer receives the `cliHumanInputEvent`.
+HITL also needs a custom UI component that is shown when the LlamaIndexServer receives the `cliHumanInputEvent`.

84-84: Add missing comma for compound sentence
Insert a comma before "and" to connect the independent clauses.

-This component displays the command to execute and the user can choose to execute or cancel the command execution.
+This component displays the command to execute, and the user can choose to execute or cancel the command execution.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~84-~84: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...

(COMMA_COMPOUND_SENTENCE_2)

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2c062c9 and c5c72f5.

📒 Files selected for processing (3)
  • packages/server/README.md (1 hunks)
  • packages/server/examples/hitl/README.md (1 hunks)
  • packages/server/examples/tsconfig.json (1 hunks)
✅ Files skipped from review due to trivial changes (2)
  • packages/server/README.md
  • packages/server/examples/tsconfig.json
🧰 Additional context used
🪛 LanguageTool
packages/server/examples/hitl/README.md

[style] ~39-~39: Consider a more expressive alternative.
Context: ...before proceeding to the next step. To do this, you will need to implement two cu...

(DO_ACHIEVE)


[uncategorized] ~44-~44: A punctuation mark might be missing here.
Context: ... implemented these two custom events in events.ts: - `cliH...

(AI_EN_LECTOR_MISSING_PUNCTUATION)


[uncategorized] ~84-~84: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...

(COMMA_COMPOUND_SENTENCE_2)


[uncategorized] ~141-~141: A punctuation mark might be missing here.
Context: ...ls The CLI executor tool is defined in tools.ts: ```types...

(AI_EN_LECTOR_MISSING_PUNCTUATION)

🪛 markdownlint-cli2 (0.17.2)
packages/server/examples/hitl/README.md

27-27: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

⏰ Context from checks skipped due to timeout of 90000ms (57)

@thucpn thucpn requested a review from marcusschiesser June 12, 2025 03:55

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

♻️ Duplicate comments (2)
packages/server/src/utils/hitl/events.ts (2)

10-14: Discriminator should be a fixed literal, not string
HumanInputEventData.type is still declared as string, which makes it impossible to reliably discriminate it from other event kinds. Please narrow it to a literal value (e.g. "human_input") to mirror HumanResponseEventData.
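
As a quick sketch of the suggested fix (the field shapes here are assumptions based on this comment, not the actual `events.ts`):

```ts
// Hypothetical event-data shapes; only the literal `type` matters.
type HumanInputEventData = {
  type: "human_input"; // literal instead of `string`
  data: unknown;
};

type HumanResponseEventData = {
  type: "human_response";
  data: unknown;
};

type HITLEventData = HumanInputEventData | HumanResponseEventData;

function handle(event: HITLEventData) {
  if (event.type === "human_input") {
    // `event` is narrowed to HumanInputEventData here
  }
}
```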


26-34: message.annotations is not on ai.Message; add augmentation or local type
The upstream Message interface exposes only id, content, and role. Accessing annotations will either produce a TS error or devolve to any. Provide a module augmentation or introduce a local ExtendedMessage type that includes annotations?: unknown[].
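
For illustration, a minimal version of the local-type option (the import and helper are assumptions, not code from this PR):

```ts
import type { Message } from "ai";

// Local extension type; avoids augmenting the upstream module.
type ExtendedMessage = Message & {
  annotations?: unknown[];
};

// Hypothetical helper showing typed access to annotations.
function lastAnnotations(messages: ExtendedMessage[]): unknown[] {
  return messages[messages.length - 1]?.annotations ?? [];
}
```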

🧹 Nitpick comments (1)
packages/server/src/utils/hitl/events.ts (1)

27-27: Prefer z.unknown() over z.any() for stronger type safety

-  const schema = z.object({ type: z.literal("human_response"), data: z.any() });
+  const schema = z.object({ type: z.literal("human_response"), data: z.unknown() });

Using unknown forces downstream consumers to perform explicit validation instead of silently accepting anything.
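
To illustrate, a standalone sketch (not code from this PR) of what the consumer side looks like with z.unknown():

```ts
import { z } from "zod";

const schema = z.object({
  type: z.literal("human_response"),
  data: z.unknown(), // instead of z.any()
});

const parsed = schema.parse({ type: "human_response", data: "ls -la" });

// With z.any(), `parsed.data.trim()` would compile unchecked; with
// z.unknown(), TypeScript requires an explicit narrowing step first:
if (typeof parsed.data === "string") {
  console.log(parsed.data.trim());
}
```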

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c5c72f5 and 66b8db6.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (4)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (1 hunks)
  • packages/server/package.json (2 hunks)
  • packages/server/project-config/package.json (1 hunks)
  • packages/server/src/utils/hitl/events.ts (1 hunks)
✅ Files skipped from review due to trivial changes (2)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
  • packages/server/project-config/package.json
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/server/package.json
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/server/src/utils/hitl/events.ts (1)
packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1)
  • Message (6-10)
⏰ Context from checks skipped due to timeout of 90000ms (57)

@thucpn thucpn mentioned this pull request Jun 12, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 7

🧹 Nitpick comments (15)
packages/create-llama/helpers/typescript.ts (1)

86-92: Consider pre-computing a Set for O(1) look-ups

NO_DATA_USE_CASES.includes(useCase) is fine for a 3-element array, but the list is growing. Turning the array into a Set in constant.ts would avoid repeated linear searches and conveys intent that values are unique.

- export const NO_DATA_USE_CASES: TemplateUseCase[] = [
-   "code_generator",
-   "document_generator",
-   "hitl",
- ];
+ export const NO_DATA_USE_CASES = new Set<TemplateUseCase>([
+   "code_generator",
+   "document_generator",
+   "hitl",
+ ]);

and here:

- if (useCase && NO_DATA_USE_CASES.includes(useCase)) {
+ if (useCase && NO_DATA_USE_CASES.has(useCase)) {
packages/create-llama/helpers/index.ts (1)

103-104: Minor readability nit – collapse boolean

const shouldRunGenerate = !useCase || !NO_DATA_USE_CASES.includes(useCase); is correct but a little dense.
Consider inlining the check or extracting a helper for clarity, e.g.:

const isDatafulUseCase = !useCase || !NO_DATA_USE_CASES.includes(useCase);
if (isDatafulUseCase) { ... }
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (1)

24-30: Avoid manual list maintenance

useCases now repeats the same literals held in NO_DATA_USE_CASES. To keep tests in sync automatically:

-const useCases: TemplateUseCase[] = [
-  "agentic_rag",
-  ...
-  "document_generator",
-  "hitl",
-];
+const useCases: TemplateUseCase[] = [
+  "agentic_rag",
+  "deep_research",
+  "financial_report",
+  ...NO_DATA_USE_CASES,
+];

This eliminates the risk of forgetting to add future no-data use cases in two places.

packages/create-llama/questions/simple.ts (3)

15-17: Growing union hints at maintainability pressure

The manual AppType union keeps expanding. You might consider deriving it from the keys of lookup or from NO_DATA_USE_CASES to avoid divergent definitions.
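
One possible shape for this, assuming the lookup table is (or can be made) a const object (a sketch, not the actual simple.ts):

```ts
// Stand-in for the real lookup table in simple.ts.
const lookup = {
  code_generator: { template: "llamaindexserver" },
  document_generator: { template: "llamaindexserver" },
  hitl: { template: "llamaindexserver" },
} as const;

// The union is derived, so it stays in sync with the table automatically.
type AppType = keyof typeof lookup;
// = "code_generator" | "document_generator" | "hitl"
```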


62-67: Choice list duplication

The choices array duplicates the literals in AppType and NO_DATA_USE_CASES. Same maintainability concern as above—centralise to one source where feasible.


182-187: Lookup entry added – consider DRYing repeated blocks

code_generator, document_generator, and hitl share identical entries. Consolidating via a helper or spreading a base object can cut noise.

const BASE_NO_DATA: Pick<...> = {
  template: "llamaindexserver",
  dataSources: [],
  tools: [],
  modelConfig: MODEL_GPT41,
};

code_generator: BASE_NO_DATA,
document_generator: BASE_NO_DATA,
hitl: BASE_NO_DATA,
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts (1)

5-8: Name clash risk

name: "cli_executor" is exported from a template package as a global tool name. If multiple projects load these templates side-by-side, name collisions may occur. Consider prefixing with the package or workflow id (hitl_cli_executor).

packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx (1)

75-80: Disabled input has onChange – dead code

Because the <input> is disabled, onChange never fires and editableCommand can’t actually be edited.
• If edits should be allowed, remove disabled.
• If not, drop the onChange handler and local state altogether.
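
A sketch of the second option, with the dead handler and local state removed (component and prop names are illustrative, not from the template):

```tsx
// Read-only command display; no onChange, no local state.
export function CommandDisplay({ command }: { command: string }) {
  return <input value={command} readOnly disabled />;
}
```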

packages/create-llama/templates/components/use-cases/python/hitl/events.py (1)

8-13: Expose stderr to the workflow

CLIHumanResponseEvent captures only execute and command. Consider adding optional stderr / exit_code so the workflow can detect failures.

🧰 Tools
🪛 Pylint (3.3.7)

[refactor] 8-8: Too few public methods (0/2)

(R0903)

packages/create-llama/templates/components/use-cases/python/hitl/README-template.md (1)

26-29: Minor markdown/style fixes

Add a language tag to the fenced block and fix the article typo:

-```
+```bash
 ...
-You can start by sending an request
+You can start by sending a request
🧰 Tools
🪛 markdownlint-cli2 (0.17.2)

26-26: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md (1)

27-31: Incorrect file reference & article typo

workflow.py is a Python file – the TS template probably wants workflow.ts. Also fix “an request”.

-To update the workflow, you can modify the code in [`workflow.py`](app/workflow.py).
+To update the workflow, you can modify the code in [`workflow.ts`](src/app/workflow.ts).

-You can start by sending an request
+You can start by sending a request
🧰 Tools
🪛 LanguageTool

[uncategorized] ~27-~27: A punctuation mark might be missing here.
Context: ...he workflow, you can modify the code in workflow.py. You can...

(AI_EN_LECTOR_MISSING_PUNCTUATION)


[misspelling] ~29-~29: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...workflow.py). You can start by sending an request on the [chat UI](http://localho...

(EN_A_VS_AN)

🪛 markdownlint-cli2 (0.17.2)

31-31: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts (2)

58-66: Rename to avoid double-data confusion

Inside the handler, data.data is easy to misread. Destructure once and rename:

-const { command, execute } = data.data;
+const { command, execute } = data; // `data` is already the payload

This keeps the API self-explanatory.


82-88: Avoid mutating user-supplied array

chatHistory.push(...) mutates the original messages array captured from the request.
Clone first to keep function-local state pure:

-const chatHistory = messages;
+const chatHistory = [...messages];
packages/create-llama/templates/components/use-cases/python/hitl/workflow.py (2)

51-55: Compact OS check

Minor readability win:

cli_language = "bash" if platform.system() in ("Linux", "Darwin") else "cmd"
🧰 Tools
🪛 Ruff (0.11.9)

52-55: Use ternary operator cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd" instead of if-else-block

Replace if-else-block with cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd"

(SIM108)

🪛 Pylint (3.3.7)

[refactor] 52-52: Consider merging these comparisons with 'in' by using 'os_name in ('Linux', 'Darwin')'. Use a set instead if elements are hashable.

(R1714)


80-87: Drop unreachable else block

The return inside the if ev.execute branch already exits the function.
Removing the extra else reduces indentation noise.

-if ev.execute:
-    ...
-    return StopEvent(result=res.stdout or res.stderr)
-else:
-    return StopEvent(result=None)
+if not ev.execute:
+    return StopEvent(result=None)
+...
+return StopEvent(result=res.stdout or res.stderr)
🧰 Tools
🪛 Pylint (3.3.7)

[refactor] 80-87: Unnecessary "else" after "return", remove the "else" and de-indent the code inside it

(R1705)

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 7ee59c5 and 5b06106.

📒 Files selected for processing (17)
  • .changeset/eight-moons-perform.md (1 hunks)
  • packages/create-llama/e2e/python/resolve_dependencies.spec.ts (1 hunks)
  • packages/create-llama/e2e/shared/llamaindexserver_template.spec.ts (1 hunks)
  • packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (3 hunks)
  • packages/create-llama/helpers/constant.ts (1 hunks)
  • packages/create-llama/helpers/index.ts (2 hunks)
  • packages/create-llama/helpers/types.ts (1 hunks)
  • packages/create-llama/helpers/typescript.ts (2 hunks)
  • packages/create-llama/questions/simple.ts (5 hunks)
  • packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx (1 hunks)
  • packages/create-llama/templates/components/use-cases/python/hitl/README-template.md (1 hunks)
  • packages/create-llama/templates/components/use-cases/python/hitl/events.py (1 hunks)
  • packages/create-llama/templates/components/use-cases/python/hitl/workflow.py (1 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md (1 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts (1 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts (1 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts (1 hunks)
✅ Files skipped from review due to trivial changes (5)
  • packages/create-llama/helpers/constant.ts
  • packages/create-llama/e2e/shared/llamaindexserver_template.spec.ts
  • packages/create-llama/helpers/types.ts
  • packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts
  • packages/create-llama/e2e/python/resolve_dependencies.spec.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • .changeset/eight-moons-perform.md
🧰 Additional context used
🧬 Code Graph Analysis (6)
packages/create-llama/helpers/index.ts (1)
packages/create-llama/helpers/constant.ts (1)
  • NO_DATA_USE_CASES (11-15)
packages/create-llama/helpers/typescript.ts (1)
packages/create-llama/helpers/constant.ts (1)
  • NO_DATA_USE_CASES (11-15)
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (1)
packages/create-llama/helpers/constant.ts (1)
  • NO_DATA_USE_CASES (11-15)
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/workflow.ts (4)
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/tools.ts (1)
  • cliExecutor (5-20)
packages/create-llama/templates/components/use-cases/typescript/hitl/src/app/events.ts (2)
  • cliHumanInputEvent (3-7)
  • cliHumanResponseEvent (9-12)
packages/server/src/utils/events.ts (1)
  • toAgentRunEvent (72-93)
packages/server/src/utils/stream.ts (1)
  • writeResponseToStream (101-120)
packages/create-llama/templates/components/use-cases/python/hitl/workflow.py (1)
packages/create-llama/templates/components/use-cases/python/hitl/events.py (3)
  • CLICommand (15-16)
  • CLIHumanInputEvent (20-34)
  • CLIHumanResponseEvent (8-12)
packages/create-llama/questions/simple.ts (1)
packages/create-llama/helpers/constant.ts (1)
  • NO_DATA_USE_CASES (11-15)
🪛 Pylint (3.3.7)
packages/create-llama/templates/components/use-cases/python/hitl/events.py

[refactor] 8-8: Too few public methods (0/2)

(R0903)


[refactor] 15-15: Too few public methods (0/2)

(R0903)


[refactor] 20-20: Too few public methods (0/2)

(R0903)

packages/create-llama/templates/components/use-cases/python/hitl/workflow.py

[refactor] 52-52: Consider merging these comparisons with 'in' by using 'os_name in ('Linux', 'Darwin')'. Use a set instead if elements are hashable.

(R1714)


[refactor] 80-87: Unnecessary "else" after "return", remove the "else" and de-indent the code inside it

(R1705)

🪛 LanguageTool
packages/create-llama/templates/components/use-cases/python/hitl/README-template.md

[uncategorized] ~38-~38: A punctuation mark might be missing here.
Context: ...he workflow, you can modify the code in workflow.py. You can...

(AI_EN_LECTOR_MISSING_PUNCTUATION)


[misspelling] ~40-~40: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...workflow.py). You can start by sending an request on the [chat UI](http://localho...

(EN_A_VS_AN)


[style] ~54-~54: Consider a more expressive alternative.
Context: ...before proceeding to the next step. To do this, you will need to implement two cu...

(DO_ACHIEVE)


[uncategorized] ~59-~59: A punctuation mark might be missing here.
Context: ... implemented these two custom events in events.ts: - `cliH...

(AI_EN_LECTOR_MISSING_PUNCTUATION)


[uncategorized] ~99-~99: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...

(COMMA_COMPOUND_SENTENCE_2)

packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md

[uncategorized] ~27-~27: A punctuation mark might be missing here.
Context: ...he workflow, you can modify the code in workflow.py. You can...

(AI_EN_LECTOR_MISSING_PUNCTUATION)


[misspelling] ~29-~29: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...workflow.py). You can start by sending an request on the [chat UI](http://localho...

(EN_A_VS_AN)


[style] ~43-~43: Consider a more expressive alternative.
Context: ...before proceeding to the next step. To do this, you will need to implement two cu...

(DO_ACHIEVE)


[uncategorized] ~48-~48: A punctuation mark might be missing here.
Context: ... implemented these two custom events in events.ts: - `cliH...

(AI_EN_LECTOR_MISSING_PUNCTUATION)


[uncategorized] ~88-~88: Use a comma before “and” if it connects two independent clauses (unless they are closely connected and short).
Context: ...omponent displays the command to execute and the user can choose to execute or cance...

(COMMA_COMPOUND_SENTENCE_2)

🪛 markdownlint-cli2 (0.17.2)
packages/create-llama/templates/components/use-cases/python/hitl/README-template.md

26-26: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


42-42: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

packages/create-llama/templates/components/use-cases/typescript/hitl/README-template.md

7-7: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


13-13: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


31-31: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

🪛 Ruff (0.11.9)
packages/create-llama/templates/components/use-cases/python/hitl/workflow.py

52-55: Use ternary operator cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd" instead of if-else-block

Replace if-else-block with cli_language = "bash" if os_name == "Linux" or os_name == "Darwin" else "cmd"

(SIM108)

⏰ Context from checks skipped due to timeout of 90000ms (55)
🔇 Additional comments (7)
packages/create-llama/helpers/typescript.ts (1)

7-7: Good move – centralised constant import

Replacing hard-coded literals with the shared NO_DATA_USE_CASES constant increases maintainability and removes duplication.

packages/create-llama/helpers/index.ts (1)

7-7: 👍 Shared constant keeps logic DRY

Importing NO_DATA_USE_CASES here removes magic strings and aligns behaviour with the other helpers.

packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2)

6-6: Consistent source of truth for exclusions – nice

Pulling NO_DATA_USE_CASES into the test reduces drift between prod code and tests.


88-89: LGTM – dynamic skip uses shared constant
The new condition correctly prevents LlamaParse runs for hitl / other no-data use cases.

packages/create-llama/questions/simple.ts (2)

2-2: 👍 Centralised constant import

Keeps the questionnaire logic aligned with helper rules.


92-94: Logic is correct – name conveys intent
shouldAskLlamaCloud cleanly expresses the new rule based on the shared constant.

packages/create-llama/templates/components/ui/use-cases/hitl/cli_human_input.tsx (1)

30-34: .at(-1) requires Node ≥ 16.6 / modern browsers

.at() shipped in ES2022 but is still absent in some older LTS environments. If broader compatibility is a goal, replace with arr[arr.length - 1].
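
For reference, a standalone sketch of the index-based equivalent:

```ts
const messages = ["first", "second", "last"];

// Works on runtimes without Array.prototype.at:
const last = messages[messages.length - 1]; // "last"
```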

@thucpn thucpn merged commit e2486eb into main Jun 12, 2025
60 checks passed
@coderabbitai coderabbitai bot mentioned this pull request Jun 19, 2025