ls1intum / Artemis

Artemis - Interactive Learning with Automated Feedback
https://docs.artemis.cit.tum.de
MIT License

Create learning analytics metrics for exercise timings #8195

Closed bassner closed 2 weeks ago

bassner commented 5 months ago

Context

Within the HybridD3 project, we're investigating the influence of presenting statistics and metrics directly to learners versus discussing their performance in an LLM-based dialogue. Initial steps for tracking user events such as exercise openings and lecture material accesses have been established in #7680.

Problem

The "science" subsystem now tracks new user events, but we're not yet using this data to generate advanced statistics. Furthermore, the current metrics do not fully meet the project partners' requirements from the education department for our experiment.

Desired User Metrics

Exercise Metrics

  1. Time Difference Between Last Submission and Exercise Due Date: Measure how late students complete assignments relative to the due date.

  2. Time Difference Between Exercise Start and Last Submission: Assess the duration students actually spend on exercises from start to completion.

For the above metrics, provide averages for exercises grouped by the same due date day, in both absolute values (seconds) and relative values (% of time between release and due date).
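The two timing metrics above can be sketched as small helpers. This is a minimal sketch assuming submission and exercise dates are available as `Instant` timestamps; the class and method names are hypothetical, not existing Artemis API:

```java
import java.time.Duration;
import java.time.Instant;

/** Hypothetical helper sketching the two exercise timing metrics. */
final class ExerciseTimingMetrics {

    /** Seconds between the last submission and the due date (positive = submitted early). */
    static long secondsBeforeDueDate(Instant lastSubmission, Instant dueDate) {
        return Duration.between(lastSubmission, dueDate).getSeconds();
    }

    /** Share (in %) of the release-to-due window that had elapsed at the last submission. */
    static double relativeSubmissionTime(Instant release, Instant dueDate, Instant lastSubmission) {
        long window = Duration.between(release, dueDate).getSeconds();
        long elapsed = Duration.between(release, lastSubmission).getSeconds();
        return 100.0 * elapsed / window;
    }
}
```

The relative value maps the release-to-due window onto 0–100 %, which makes exercises with differently sized windows comparable before averaging per due-date day.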

Weekly Activity Metrics

  1. Number of Submissions per Exercise: Track the weekly total of submissions, grouped by exercise ID.

  2. Number of Exercises Submitted to: Count how many different exercises a student submits per week, both as an absolute number and relative to the total number of active exercises that week (active = start date <= current date <= due date).
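The second weekly metric could be computed along these lines; a sketch under the assumption that one week's submission events and the IDs of that week's active exercises are already loaded (all names are hypothetical):

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

/** Hypothetical sketch: distinct exercises a student submitted to, relative to active ones. */
final class WeeklyActivityMetrics {

    record Submission(long exerciseId) {}

    /** Absolute count of distinct exercises among one week's submissions. */
    static long distinctExercises(List<Submission> weekSubmissions) {
        return weekSubmissions.stream().map(Submission::exerciseId).distinct().count();
    }

    /** Distinct active exercises submitted to, as a fraction of all exercises active that week. */
    static double relativeParticipation(List<Submission> weekSubmissions, Set<Long> activeExerciseIds) {
        Set<Long> submittedTo = weekSubmissions.stream()
                .map(Submission::exerciseId)
                .filter(activeExerciseIds::contains)
                .collect(Collectors.toSet());
        return (double) submittedTo.size() / activeExerciseIds.size();
    }
}
```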

Lecture Unit Metrics

  1. Weekly Lecture Material Access: Calculate the weekly number of accesses to lecture unit materials, grouped by unit type (e.g., PDF, video).
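Grouping accesses by unit type is a plain aggregation; a sketch assuming one week's access events carry a unit-type label (the event shape is an assumption, not the science subsystem's actual model):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/** Hypothetical sketch: count one week's lecture unit accesses per unit type. */
final class LectureUnitMetrics {

    record AccessEvent(String unitType) {} // e.g. "attachment" (PDF) or "video"

    static Map<String, Long> accessesByUnitType(List<AccessEvent> weekEvents) {
        return weekEvents.stream()
                .collect(Collectors.groupingBy(AccessEvent::unitType, Collectors.counting()));
    }
}
```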

Data Access

To support both direct student feedback and integration with AI-mediated dialogue systems, the metrics should be accessible through:

This approach ensures that the data can efficiently serve both direct student engagement and our future course-level IRIS chat pipeline.

LFGUzL commented 5 months ago

Context

Within the HybridD3 project, we're investigating the influence of presenting statistics and metrics directly to learners versus discussing their performance in an LLM-based dialogue. Initial steps for tracking user events such as exercise openings and lecture material accesses have been established in #7680.

Problem

The "science" subsystem now tracks new user events, but we're not yet using this data to generate advanced statistics. Furthermore, the current metrics do not fully meet the project partners' requirements from the education department for our experiment.

Desired User Metrics

Exercise Metrics

  1. Time Difference Between Last Submission and Exercise Due Date: Measure how late students complete assignments relative to the due date.

  2. Time Difference Between Exercise Start and Last Submission: Assess the duration students actually spend on exercises from start to completion.

  3. Time Difference Between Release Date and Exercise Start: Assess the time students wait to start an exercise.

For the above metrics, provide averages for exercises grouped by the same start and due date day, in both absolute values (seconds) and relative values (% of time between release and due date).

Weekly Activity Metrics

  1. Number of Submissions per Exercise: Track the total number of submissions, grouped by exercise ID.

  2. Number of Exercises Submitted to Weekly: Count how many different exercises a student submits per week, both as an absolute number and relative to the total number of active exercises that week (active = start date <= current date <= due date).

For each of the metrics above, differentiate by a) difficulty level and b) type of the exercise. Also, provide a filter for final submissions that have <50% of questions answered or, for auto-graded exercises, <30% correct.

  3. Gaps Between Activity: For meaningful bins (e.g. <1min, <3min, <5min, <15min, <30min, <1h, <2h, <4h, <16h, <36h, <2d, <3d, <4d, <5d, <7d, <12d, <16d), count the time differences between a) a student's submissions and b) a student's final submissions to different exercises.
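Assigning a gap to one of the proposed bins amounts to finding the first upper bound that exceeds it; a sketch of that lookup (the class name and the -1 convention for gaps of 16 days or more are assumptions):

```java
import java.time.Duration;
import java.util.List;

/** Hypothetical sketch: assign an activity gap to the first bin it fits into. */
final class ActivityGapBins {

    // Upper bounds of the proposed bins, in ascending order.
    static final List<Duration> BIN_UPPER_BOUNDS = List.of(
            Duration.ofMinutes(1), Duration.ofMinutes(3), Duration.ofMinutes(5),
            Duration.ofMinutes(15), Duration.ofMinutes(30), Duration.ofHours(1),
            Duration.ofHours(2), Duration.ofHours(4), Duration.ofHours(16),
            Duration.ofHours(36), Duration.ofDays(2), Duration.ofDays(3),
            Duration.ofDays(4), Duration.ofDays(5), Duration.ofDays(7),
            Duration.ofDays(12), Duration.ofDays(16));

    /** Index of the first bin whose upper bound exceeds the gap, or -1 for gaps of 16 days or more. */
    static int binIndex(Duration gap) {
        for (int i = 0; i < BIN_UPPER_BOUNDS.size(); i++) {
            if (gap.compareTo(BIN_UPPER_BOUNDS.get(i)) < 0) {
                return i;
            }
        }
        return -1;
    }
}
```

A histogram of the metric is then just a count of gaps per returned index.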

Lecture Unit Metrics

  1. Weekly Lecture Material Access: Calculate the weekly number of accesses to lecture unit materials, grouped by unit type (e.g., PDF, video).

(If possible) Content Access After Initial Open: Track when and for how long a tab with the content (the PDF, video, or task) was actively viewed.

  2. Viewing the Learning Analytics Dashboard (optional): Track when a student interacted (e.g. scrolled for >3s within a 1-minute interval) with the home page where the graphs or the LLM chat are displayed.

Data Access

To support both direct student feedback and integration with AI-mediated dialogue systems, the metrics should be accessible through:

This approach ensures that the data can efficiently serve both direct student engagement and our future course-level IRIS chat pipeline.