Case Report · Published curriculum / implementation paper · 2023
AAB-CASE-2026-RV-108

Ripple: Concept-Based Interpretation for Raw Time Series Models in Education

Time series is the most prevalent form of input data for educational prediction tasks. The vast majority of research using time series data focuses on hand-crafted features, designed by experts for predictive performance and interpretability.

This page documents an AI literacy or AI education case for registry purposes. It is descriptive and does not imply AAB endorsement of any specific tool, provider, or intervention.
01 · Implementation

Source publication / research team or educational organization described in paper

02 · Learning context

Research / curriculum design context

03 · AI role

Evaluator

04 · Outcome signal

Conceptual understanding

Registry Facets

Education Level
  • Unspecified / broad education
Subject Area
  • AI for education
  • interpretability
  • ML concepts / supervised learning
  • Explainable AI / robustness
Use Case Type
  • Assessment support
Stakeholder Group
  • Students
AI Capability Type
  • ML concepts / supervised learning
  • Explainable AI / robustness
  • Assessment / tutoring analytics
Implementation Model
  • Research / curriculum design context
Evidence Type
  • Learning analytics
Outcomes Domain
  • Conceptual understanding
  • Assessment / feedback quality

Implementing Organization

Organization Type

Source publication / research team or educational organization described in paper

Location

Not specified in extracted text

Primary Facilitator Role

Researchers, educators, instructors, or facilitators as described in the source publication

Learning Context

Setting Type
  • Research / curriculum design context
Session Format

Classroom, course, or resource-based AI education activity

Duration

10 weeks

Group Size

23 MOOCs with over 100,000 students and millions of interactions (per the source abstract).

Devices

Not specified in extracted text

Constraints
  • High-stakes or student-data-centered AI use requires stronger governance, transparency, and bias monitoring.

Learner Profile

Age Range

Unspecified / broad education

Prior AI Exposure Assumed

Mixed or not explicitly specified; infer from target learner group and intervention design.

Prior Programming Background Assumed

Varies by intervention; not specified unless the paper explicitly describes prerequisites.

Educational Intent

Primary Learning Goals
  • Document the AI education intervention, course, tool, or resource described in the source publication.
  • Extract the learner context, AI role, pedagogy, outcomes, and constraints for AAB registry comparison.
Secondary Learning Goals
  • Support AAB comparison across AI literacy, AI education, teacher training, higher education, and workforce contexts.
  • Capture evidence maturity, transferability, and limitations rather than treating the publication as product endorsement.
What This Was Not
  • Not an AAB endorsement of the tool, curriculum, provider, or result.
  • Not a direct replication record unless the source paper reports implementation details sufficient for replication.

AI Tool Description

Tool Type

ML concepts / supervised learning, Explainable AI / robustness, Assessment / tutoring analytics

Languages

Not specified in extracted text

AI Role
  • Evaluator
User Interaction Model
  • Primary interaction pattern inferred from publication: Assessment support.
  • AI capability focus: ML concepts / supervised learning, Explainable AI / robustness, Assessment / tutoring analytics.
Safeguards
  • Minimize personal data collection and avoid storing identifiable learner media unless approved by local policy/IRB.

Activity Design

Activity Flow
  • Review the publication’s reported context, learner group, AI tool or curriculum, implementation process, and outcome evidence.
  • Map the case to AAB registry fields for comparison across educational levels and AI capability types.
  • Use the source publication and PDF for any manual verification before public registry release.
Human Vs AI Responsibilities
  • Human educators/researchers remain responsible for instructional design, supervision, interpretation, and ethical safeguards.
  • AI systems or AI concepts provide the learning object, support tool, evaluator, simulator, or automation context depending on the paper.
Scaffolding Strategies
  • Instructional / curriculum-based learning
  • Registry extraction emphasizes explicit learning goals, observed outcomes, constraints, and safety limitations.

Observed Challenges

Educators Reported
  • High-stakes or student-data-centered AI use requires stronger governance, transparency, and bias monitoring.

Design Adaptations

Adaptations
  • Case classified under: Published curriculum / implementation paper.
  • Pedagogical pattern: Instructional / curriculum-based learning.
  • Any additional adaptations should be verified against the full paper before public-facing publication.

Reported Outcomes

Engagement
  • Engagement evidence should be interpreted according to the source paper’s reported method and sample.
Learning Signals
  • The vast majority of research using time series data focuses on hand-crafted features, designed by experts for predictive performance and interpretability.
Educators Reflection

Time series is the most prevalent form of input data for educational prediction tasks. The vast majority of research using time series data focuses on hand-crafted features, designed by experts for predictive performance and interpretability.

Ethical & Privacy Considerations

Privacy
  • Minimize personal data collection and avoid storing identifiable learner media unless approved by local policy/IRB.

Evidence Type

Evidence
  • Learning analytics

Relevance to Research

Potential Research Use
  • Can be used as an AAB evidence record for cross-case comparison, standards drafting, and evidence-maturity mapping.
  • Supports identification of recurring patterns in AI literacy, AI education implementation, teacher preparation, assessment, and responsible AI learning.
Relevant Research Domains
  • Conceptual understanding
  • Assessment / feedback quality
  • Assessment support
  • ML concepts / supervised learning
  • Explainable AI / robustness
  • Assessment / tutoring analytics

Case Status

Case Status
  • Completed

AAB Classification Tags

Age

Unspecified / broad education

Setting

Research / curriculum design context

AI Function

ML concepts / supervised learning, Explainable AI / robustness, Assessment / tutoring analytics

Pedagogy

Instructional / curriculum-based learning

Risk Level

High

Data Sensitivity

High

Source Publication

Title

Ripple: Concept-Based Interpretation for Raw Time Series Models in Education

Authors
  • Mohammad Asadi
  • Vinitra Swamy
  • Jibril Frej
  • Julien Vignoud
  • Mirko Marras
  • Tanja Käser
Venue

Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 37 No. 13, EAAI-23

Year

2023

DOI

10.1609/aaai.v37i13.26888

Source URL

https://ojs.aaai.org/index.php/AAAI/article/view/26888

PDF URL

https://ojs.aaai.org/index.php/AAAI/article/view/26888/26660

PDF Filename

079_Ripple_ Concept-Based Interpretation for Raw Time Series Models in Education.pdf

Page Count

9

Abstract

Time series is the most prevalent form of input data for educational prediction tasks. The vast majority of research using time series data focuses on hand-crafted features, designed by experts for predictive performance and interpretability. However, extracting these features is labor-intensive for humans and computers. In this paper, we propose an approach that utilizes irregular multivariate time series modeling with graph neural networks to achieve comparable or better accuracy with raw time series clickstreams in comparison to hand-crafted features. Furthermore, we extend concept activation vectors for interpretability in raw time series models. We analyze these advances in the education domain, addressing the task of early student performance prediction for downstream targeted interventions and instructional support. Our experimental analysis on 23 MOOCs with millions of combined interactions over six behavioral dimensions shows that models designed with our approach can (i) beat state-of-the-art educational time series baselines with no feature extraction and (ii) provide interpretable insights for personalized interventions. Source code: https://github.com/epfl-ml4ed/ripple/.
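The abstract describes extending concept activation vectors (CAVs) to raw time series models. As a rough illustration of the underlying CAV idea only (not the paper's actual Ripple implementation), the sketch below trains a linear probe to separate "concept" activations from random activations and uses its unit-norm weight vector as the CAV; all array sizes, the synthetic activations, and the toy output gradient are invented for the example.

```python
# Minimal CAV sketch on synthetic data (assumption: this mirrors the generic
# TCAV recipe, not the Ripple codebase). A linear probe is fit on hidden-layer
# activations; its normalized weight vector is the concept activation vector.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64-dim hidden activations (e.g., from one layer of a trained model).
concept_acts = rng.normal(loc=1.0, size=(100, 64))  # examples exhibiting the concept
random_acts = rng.normal(loc=0.0, size=(100, 64))   # random counterexamples

X = np.vstack([concept_acts, random_acts])
y = np.array([1] * 100 + [0] * 100)

# Logistic-regression probe via plain gradient descent (no external deps).
w = np.zeros(64)
b = 0.0
lr = 0.1
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # probe predictions
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient step on weights
    b -= lr * float(np.mean(p - y))          # gradient step on bias

cav = w / np.linalg.norm(w)  # unit-norm concept activation vector

# Concept sensitivity for one input: directional derivative of the model
# output along the CAV. The gradient here is a toy stand-in.
grad_of_output = rng.normal(size=64)
sensitivity = float(grad_of_output @ cav)
print("CAV norm:", float(np.linalg.norm(cav)))
```

In TCAV-style analysis, the sign of such sensitivities, aggregated over many inputs, indicates whether the concept pushes predictions up or down; Ripple's contribution per the abstract is making this work for raw time series inputs.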

Transferability

Best Fit Contexts
  • Research / curriculum design context
Likely Failure Modes
  • High-stakes or student-data-centered AI use requires stronger governance, transparency, and bias monitoring.

Cost And Operations

Time Cost Notes

Not specified in extracted text unless noted in duration field.

Staffing Notes

Requires educators/researchers/facilitators with sufficient AI literacy and pedagogy knowledge for the target learners.

Infra Notes

Infrastructure depends on AI tool type, learner devices, data access, and institutional policy context.

Extraction Notes

Confidence

High

Missing Information

Reasoning Limits

This entry was automatically extracted from the PDF text and manifest metadata. Fields should be manually verified before public registry publication, especially group size, location, duration, and outcome claims.

Duplicate Check Against Uploaded Cases JSON

Closest Existing Title

Opportunities, challenges and school strategies for integrating generative AI in education

Similarity Score

0.482

Likely Duplicate

false
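The similarity score above comes from the registry's own duplicate-check pipeline, whose method is not documented here. As one dependency-free way such a title score could be computed, the sketch below uses Python's difflib; the `title_similarity` helper and the 0.85 threshold are assumptions for illustration, and this is not expected to reproduce the 0.482 figure.

```python
# Hypothetical duplicate check between a new case title and the closest
# existing registry title, using stdlib difflib (an assumption, not the
# documented AAB method).
from difflib import SequenceMatcher


def title_similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


new_title = "Ripple: Concept-Based Interpretation for Raw Time Series Models in Education"
existing = "Opportunities, challenges and school strategies for integrating generative AI in education"

score = title_similarity(new_title, existing)
is_duplicate = score >= 0.85  # example threshold, chosen for illustration
print(round(score, 3), is_duplicate)
```

A ratio-based check like this flags near-verbatim title reuse but misses paraphrased duplicates, which is one reason registries typically pair it with manual review.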

Registry Metadata

Case ID

AAB-CASE-2026-RV-108

Publication Status

Published curriculum / implementation paper

Tags
  • case
  • Unspecified / broad education
  • Not specified in extracted text
  • Research / curriculum design context
  • ML concepts / supervised learning
  • AI for education
  • interpretability
  • Explainable AI / robustness
  • Assessment support