CLGT: A Graph Transformer for Student Performance Prediction in Collaborative Learning
Modeling and predicting the performance of students in collaborative learning paradigms is an important task. Most of the research in the literature on collaborative learning focuses on discussion forums and social learning networks.
Implementation
- Implementer: Source publication / research team or educational organization described in paper
- Learning context: Research / curriculum design context
- AI role: Evaluator
- Outcome signal: Conceptual understanding
Registry Facets
- Unspecified / broad education
- AI for education
- collaborative learning
- Explainable AI / robustness
- Assessment / tutoring analytics
- Curriculum / course design
- Teacher professional development
- Assessment support
- Students
- Teachers
- Adult learners / professionals
- Researchers
- Research / curriculum design context
- Learning analytics
- Activity documentation
- Conceptual understanding
- Teacher readiness
- Assessment / feedback quality
Implementing Organization
Source publication / research team or educational organization described in paper
Not specified in extracted text
Researchers, educators, instructors, or facilitators as described in the source publication
Learning Context
- Research / curriculum design context
Course implementation or course design
Not specified in extracted text
Not specified in extracted text
Explainable AI / robustness, Assessment / tutoring analytics
- Teacher readiness, time, support, and classroom integration may affect implementation quality.
- High-stakes or student-data-centered AI use requires stronger governance, transparency, and bias monitoring.
Learner Profile
Unspecified / broad education
Mixed or not explicitly specified; infer from target learner group and intervention design.
Varies by intervention; not specified unless the paper explicitly describes prerequisites.
Educational Intent
- Document the AI education intervention, course, tool, or resource described in the source publication.
- Extract the learner context, AI role, pedagogy, outcomes, and constraints for AAB registry comparison.
- Modeling and predicting the performance of students in collaborative learning paradigms is an important task.
- Support AAB comparison across AI literacy, AI education, teacher training, higher education, and workforce contexts.
- Capture evidence maturity, transferability, and limitations rather than treating the publication as product endorsement.
- Not an AAB endorsement of the tool, curriculum, provider, or result.
- Not a direct replication record unless the source paper reports implementation details sufficient for replication.
AI Tool Description
Explainable AI / robustness, Assessment / tutoring analytics
Not specified in extracted text
- Evaluator
- Primary interaction pattern inferred from publication: Curriculum / course design, Teacher professional development, Assessment support.
- AI capability focus: Explainable AI / robustness, Assessment / tutoring analytics.
- Minimize personal data collection and avoid storing identifiable learner media unless approved by local policy/IRB.
Activity Design
- Review the publication’s reported context, learner group, AI tool or curriculum, implementation process, and outcome evidence.
- Map the case to AAB registry fields for comparison across educational levels and AI capability types.
- Use the source publication and PDF for any manual verification before public registry release.
- Human educators/researchers remain responsible for instructional design, supervision, interpretation, and ethical safeguards.
- AI systems or AI concepts provide the learning object, support tool, evaluator, simulator, or automation context depending on the paper.
- Instructional / curriculum-based learning
- Registry extraction emphasizes explicit learning goals, observed outcomes, constraints, and safety limitations.
Observed Challenges
- Teacher readiness, time, support, and classroom integration may affect implementation quality.
- High-stakes or student-data-centered AI use requires stronger governance, transparency, and bias monitoring.
Design Adaptations
- Case classified under: Published curriculum / implementation paper.
- Pedagogical pattern: Instructional / curriculum-based learning.
- Any additional adaptations should be verified against the full paper before public-facing publication.
Reported Outcomes
- Engagement evidence should be interpreted according to the source paper’s reported method and sample.
- Most of the research in the literature on collaborative learning focuses on discussion forums and social learning networks.
Ethical & Privacy Considerations
- Minimize personal data collection and avoid storing identifiable learner media unless approved by local policy/IRB.
Evidence Type
- Learning analytics
- Activity documentation
Relevance to Research
- Can be used as an AAB evidence record for cross-case comparison, standards drafting, and evidence-maturity mapping.
- Supports identification of recurring patterns in AI literacy, AI education implementation, teacher preparation, assessment, and responsible AI learning.
- Conceptual understanding
- Teacher readiness
- Assessment / feedback quality
- Curriculum / course design
- Teacher professional development
- Assessment support
- Explainable AI / robustness
- Assessment / tutoring analytics
Case Status
- Completed
AAB Classification Tags
Unspecified / broad education
Research / curriculum design context
Explainable AI / robustness, Assessment / tutoring analytics
Instructional / curriculum-based learning
High
High
Source Publication
CLGT: A Graph Transformer for Student Performance Prediction in Collaborative Learning
- Tianhao Peng
- Yu Liang
- Wenjun Wu
- Jian Ren
- Zhao Pengrui
- Yanjun Pu
Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 37 No. 13, EAAI-23
2023
10.1609/aaai.v37i13.26893
https://ojs.aaai.org/index.php/AAAI/article/view/26893
https://ojs.aaai.org/index.php/AAAI/article/view/26893/26665
084_CLGT_ A Graph Transformer for Student Performance Prediction in Collaborative Learning.pdf
8
Modeling and predicting the performance of students in collaborative learning paradigms is an important task. Most of the research presented in literature regarding collaborative learning focuses on the discussion forums and social learning networks. There are only a few works that investigate how students interact with each other in team projects and how such interactions affect their academic performance. In order to bridge this gap, we choose a software engineering course as the study subject. The students who participate in a software engineering course are required to team up and complete a software project together. In this work, we construct an interaction graph based on the activities of students grouped in various teams. Based on this student interaction graph, we present an extended graph transformer framework for collaborative learning (CLGT) for evaluating and predicting the performance of students. Moreover, the proposed CLGT contains an interpretation module that explains the prediction results and visualizes the student interaction patterns. The experimental results confirm that the proposed CLGT outperforms the baseline models in terms of performing predictions based on the real-world datasets. Moreover, the proposed CLGT differentiates the students with poor performance in the collaborative learning paradigm and gives teachers early warnings, so that appropriate assistance can be provided.
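The abstract describes the core pipeline: build a student interaction graph from team activities, apply attention over that graph, and read out per-student performance predictions that can flag at-risk students for early intervention. The following is a minimal illustrative sketch of that idea only, not the paper's implementation: all names, features, weights, and interaction data below are hypothetical, and a single hand-rolled attention pass stands in for the full graph transformer.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def graph_attention(features, edges):
    """One attention pass over a student interaction graph.

    Each student aggregates teammates' features, weighted by interaction
    strength (the edge weight serves as the attention logit here).
    `features`: {student: [floats]}, `edges`: {(u, v): weight}.
    """
    out = {}
    for u, fu in features.items():
        nbrs = [(v, w) for (a, v), w in edges.items() if a == u]
        if not nbrs:
            out[u] = fu[:]
            continue
        alphas = softmax([w for _, w in nbrs])
        agg = [0.0] * len(fu)
        for (v, _), a in zip(nbrs, alphas):
            for i, x in enumerate(features[v]):
                agg[i] += a * x
        # Residual connection: own features plus the attended neighbourhood.
        out[u] = [x + y for x, y in zip(fu, agg)]
    return out

def predict_scores(features, weights):
    """Linear readout: one predicted performance score per student."""
    return {u: sum(x * w for x, w in zip(f, weights))
            for u, f in features.items()}

# Toy team (hypothetical data): commit count and review count as features.
feats = {"ana": [5.0, 2.0], "bo": [1.0, 0.0], "cy": [3.0, 4.0]}
inter = {("ana", "bo"): 1.0, ("ana", "cy"): 2.0,
         ("bo", "ana"): 1.0, ("cy", "ana"): 1.0}

h = graph_attention(feats, inter)
scores = predict_scores(h, [0.6, 0.4])
at_risk = min(scores, key=scores.get)  # lowest predicted performance
```

The early-warning behaviour mentioned in the abstract corresponds here to `at_risk`: the student with the lowest predicted score would be surfaced to the teacher for assistance.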
Transferability
- Research / curriculum design context
- Teacher readiness, time, support, and classroom integration may affect implementation quality.
- High-stakes or student-data-centered AI use requires stronger governance, transparency, and bias monitoring.
Cost And Operations
Not specified in extracted text unless noted in duration field.
Requires educators/researchers/facilitators with sufficient AI literacy and pedagogy knowledge for the target learners.
Infrastructure depends on AI tool type, learner devices, data access, and institutional policy context.
Extraction Notes
- Extraction confidence: High
- Fields flagged for manual verification: group_size, duration
- This entry was automatically extracted from the PDF text and manifest metadata. Fields should be manually verified before public registry publication, especially group size, location, duration, and outcome claims.
