
Conversation


@EugeneOYZ1203n EugeneOYZ1203n commented Apr 9, 2025

Description

Adds a "Generate AI comments" field to the Grading Editor and Workspace, and adds fields to the course configuration panels for enabling LLM grading and setting a course-specific API key.

Type of change

  • Added an AI comment generation box to the Grading Editor and Workspace for programming questions
  • Added Enable LLM Grading and LLM API Key fields to the course configuration, with corresponding inputs in the Dropdown Create Course UI and the Course Configuration Panel
  • Added three functions in RequestsSaga that call the backend
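The three RequestsSaga additions are only described at a high level here; as a rough illustration, a backend-calling helper in that file might build a request like the following sketch (the endpoint path, names, and payload shape are hypothetical, not the actual Source Academy API):

```typescript
// Hypothetical sketch only: the real RequestsSaga helpers, endpoint path,
// and payload shape may differ from what is shown here.
type SaveFinalCommentRequest = {
  url: string;
  method: 'POST';
  body: { submissionId: number; questionId: number; comment: string };
};

// Builds the request descriptor that a saga helper would pass to fetch().
function buildSaveFinalCommentRequest(
  courseId: number,
  submissionId: number,
  questionId: number,
  comment: string
): SaveFinalCommentRequest {
  return {
    url: `/v2/courses/${courseId}/admin/grading/${submissionId}/${questionId}/final-comment`,
    method: 'POST',
    body: { submissionId, questionId, comment }
  };
}
```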

How to test

None yet. I am not sure how to test the frontend; please advise on what I should include.
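One possible starting point for the question above: pure pieces of the change can be unit-tested with plain Jest assertions, without rendering any React. A hypothetical sketch (this helper and its types are illustrative, not code from the PR):

```typescript
// Illustrative only: a pure state-update helper like this can be covered by
// plain assertions, no DOM or React rendering required.
type CourseLlmConfig = { enableLlmGrading: boolean; llmApiKey?: string };

function setEnableLlmGrading(
  config: CourseLlmConfig,
  enabled: boolean
): CourseLlmConfig {
  // Copy rather than mutate so Redux state updates stay immutable.
  return { ...config, enableLlmGrading: enabled };
}
```

Component-level behaviour (e.g. the comment selector) could then be exercised with a tool such as React Testing Library, following whatever component tests already exist in the repository.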

Checklist

Please also advise on how to update the documentation. Thanks!

  • I have tested this code
  • I have updated the documentation


coveralls commented Apr 17, 2025

Pull Request Test Coverage Report for Build 18897741900

Details

  • 36 of 348 (10.34%) changed or added relevant lines in 12 files are covered.
  • 6 unchanged lines in 4 files lost coverage.
  • Overall coverage decreased (-0.2%) to 42.274%

Changes Missing Coverage

| File | Covered Lines | Changed/Added Lines | % |
|---|---:|---:|---:|
| src/commons/assessment/AssessmentTypes.ts | 0 | 1 | 0.0% |
| src/commons/mocks/BackendMocks.ts | 0 | 5 | 0.0% |
| src/commons/sagas/BackendSaga.ts | 0 | 5 | 0.0% |
| src/commons/dropdown/DropdownCreateCourse.tsx | 19 | 27 | 70.37% |
| src/pages/academy/grading/subcomponents/GradingWorkspace.tsx | 0 | 17 | 0.0% |
| src/pages/academy/adminPanel/AdminPanel.tsx | 0 | 18 | 0.0% |
| src/pages/academy/grading/subcomponents/GradingCommentSelector.tsx | 0 | 25 | 0.0% |
| src/commons/sagas/RequestsSaga.ts | 4 | 39 | 10.26% |
| src/pages/academy/adminPanel/subcomponents/CourseConfigPanel.tsx | 1 | 90 | 1.11% |
| src/pages/academy/grading/subcomponents/GradingEditor.tsx | 0 | 109 | 0.0% |
Files with Coverage Reduction

| File | New Missed Lines | % |
|---|---:|---:|
| src/pages/academy/adminPanel/subcomponents/CourseConfigPanel.tsx | 1 | 2.99% |
| src/pages/academy/grading/subcomponents/GradingEditor.tsx | 1 | 0.0% |
| src/commons/sagas/RequestsSaga.ts | 2 | 19.78% |
| src/pages/academy/adminPanel/AdminPanel.tsx | 2 | 0.0% |
Totals

  • Change from base Build 18878649597: -0.2%
  • Covered Lines: 21141
  • Relevant Lines: 52395

💛 - Coveralls

@RichDom2185 RichDom2185 self-requested a review April 17, 2025 19:51
Member

@RichDom2185 RichDom2185 left a comment


Is there a backend PR to accompany this frontend PR?

@RichDom2185 RichDom2185 added the blocked Something else needs pass review first label Apr 17, 2025
@EugeneOYZ1203n
Author

> Is there a backend PR to accompany this frontend PR?

This is the accompanying backend PR:
source-academy/backend#1248

@Tkaixiang Tkaixiang self-assigned this Sep 8, 2025
@RichDom2185 RichDom2185 requested a review from Copilot October 28, 2025 09:59

Copilot AI left a comment


Pull Request Overview

This PR implements AI-powered marking capabilities for programming questions, allowing instructors to generate grading comments using LLM technology. The changes introduce LLM configuration options at the course level and integrate comment generation into the grading workflow.

Key Changes

  • Added LLM comment generation UI in the Grading Editor and Workspace with a comment selector component
  • Introduced course-level LLM configuration fields (API key, model, URL, and course-level prompt) in both admin panel and course creation flow
  • Created backend integration functions for generating AI comments and saving final feedback

Reviewed Changes

Copilot reviewed 18 out of 18 changed files in this pull request and generated 7 comments.

| File | Description |
|---|---|
| src/pages/academy/grading/subcomponents/GradingEditor.tsx | Adds AI comment generation dialog, prompt viewing, and comment selector integration |
| src/pages/academy/grading/subcomponents/GradingCommentSelector.tsx | New component for displaying and selecting from LLM-generated comment suggestions |
| src/pages/academy/grading/subcomponents/GradingWorkspace.tsx | Passes LLM-related props to GradingEditor for programming questions |
| src/pages/academy/adminPanel/subcomponents/CourseConfigPanel.tsx | Adds LLM configuration UI with fields for API key, model, URL, and course prompt |
| src/pages/academy/adminPanel/AdminPanel.tsx | Includes LLM configuration in default course config and state synchronization |
| src/commons/dropdown/DropdownCreateCourse.tsx | Adds LLM grading toggle and API key field to course creation dialog |
| src/commons/sagas/RequestsSaga.ts | Implements API calls for generating comments and saving final feedback |
| src/commons/sagas/BackendSaga.ts | Updates grading submission to include LLM enablement flag |
| src/features/grading/GradingTypes.ts | Extends grading types to support AI comments and LLM assessment prompts |
| src/commons/assessment/AssessmentTypes.ts | Adds LLM prompt field to programming question type |
| src/commons/application/types/SessionTypes.ts | Defines LLM-related session state and course configuration types |
| src/pages/localStorage.ts | Persists LLM configuration in local storage |
| src/styles/_workspace.scss | Styles for grading comment selector component |
| src/styles/_academy.scss | Styles for LLM prompt dialog and configuration sections |
| src/commons/mocks/GradingMocks.ts | Updates mock data with LLM-related fields |
| src/commons/mocks/BackendMocks.ts | Updates mock backend to include LLM grading flag |
| src/commons/application/reducers/tests/SessionReducer.test.ts | Adds LLM fields to test data |
| src/commons/application/actions/tests/SessionActions.test.ts | Updates test expectations for grading actions |
Comments suppressed due to low confidence (1)

src/pages/academy/grading/subcomponents/GradingEditor.tsx:1

  • Use strict equality operator (===) instead of loose equality (==) for type comparison.
import {


@RichDom2185
Member

@sentry review

}
}

.llm-grading-config {
Member


Nit: We should avoid using global styles and move to scoped CSS modules (files ending in .module.css/.module.scss)

}

.llm-grading-config {
.bp6-form-group {
Member


Abstraction violation. Use $ns so that if the Blueprint version changes, this style won't break
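As a sketch of this suggestion, using Blueprint's exported namespace variable (the import path and webpack tilde prefix are assumptions about this repo's build setup):

```scss
// Hypothetical sketch: reference Blueprint's $ns variable instead of
// hard-coding the versioned bp6- class prefix.
@import '~@blueprintjs/core/lib/scss/variables';

.llm-grading-config {
  .#{$ns}-form-group {
    // This selector survives a Blueprint major-version bump (bp6 -> bp7).
    margin-bottom: 0; // placeholder rule for illustration
  }
}
```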

}
}

.grading-comment-selector {
Member


Ditto on CSS modules

enableSourcecast: state.session.enableSourcecast,
enableStories: state.session.enableStories,
enableLlmGrading: state.session.enableLlmGrading,
llmApiKey: state.session.llmApiKey,
Member


Why are we storing in frontend? Isn't this very dangerous?

Comment on lines +43 to +46
readonly llmApiKey?: string;
readonly llmModel?: string;
readonly llmApiUrl?: string;
readonly llmCourseLevelPrompt?: string;
Member


Why should these be under session (or stored in the frontend at all)

prepend: '',
solutionTemplate: '//This is a mock solution template',
postpend: '',
llm_prompt: null,
Member


Why is this needed? It's all done in the backend right?

Comment on lines +330 to +335
llm_course_level_prompt={llm_course_level_prompt || ''}
llm_assessment_prompt={grading!.assessment.llm_assessment_prompt}
llm_question_prompt={
(grading!.answers[questionId].question.type === 'programming' &&
grading!.answers[questionId].question.llm_prompt) ||
''
Member


All of this shouldn't be in the frontend to avoid prompt injection. Just use the value (that is already stored in the backend anyway)
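The point can be sketched as follows: the request carries only identifiers, and the backend resolves its stored prompts itself, so no prompt text ever round-trips through the client (purely illustrative, not the PR's actual payload):

```typescript
// Illustrative only: note the deliberate absence of llm_prompt,
// llm_assessment_prompt, or llm_course_level_prompt fields. The backend
// already stores these and can look them up from the identifiers alone.
function buildGenerateCommentsPayload(
  submissionId: number,
  questionId: number
): { submissionId: number; questionId: number } {
  return { submissionId, questionId };
}
```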

@RichDom2185
Member

An LLM key should never be put in session state, let alone persisted, let alone in the frontend. Paste the plaintext once upon course creation; it is sent to the BE and encrypted forever. You should not be able to view it again, only change it subsequently via the dialog.
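A sketch of that write-once flow (the endpoint, field names, and response shape below are hypothetical):

```typescript
// Hypothetical sketch of a write-once API key flow. The plaintext key is
// posted exactly once; the client afterwards stores only a boolean flag
// derived from the server response, never the key itself.
type SetLlmKeyRequest = {
  url: string;
  method: 'PUT';
  body: { llmApiKey: string };
};

function buildSetLlmKeyRequest(courseId: number, plaintextKey: string): SetLlmKeyRequest {
  return {
    url: `/v2/courses/${courseId}/admin/llm-key`,
    method: 'PUT',
    body: { llmApiKey: plaintextKey }
  };
}

// What the client keeps after the request resolves: only whether a key is set.
function toClientState(response: { llmKeyConfigured: boolean }): { llmKeyIsSet: boolean } {
  return { llmKeyIsSet: response.llmKeyConfigured };
}
```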

