Self-paced contrastive learning for knowledge tracing

Huan Dai, Yue Yun, Yupei Zhang, Rui An, Wenxin Zhang, Xuequn Shang

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Knowledge tracing is a fundamental research challenge in personalized education: it aims to dynamically monitor students' evolving mastery of individual skills by analyzing their online answer data. Notable advances have been achieved in knowledge tracing models, particularly through the integration of deep learning techniques. These models use long short-term memory (LSTM) networks to process students' answer sequences, infer latent skill states, and predict subsequent responses. Real-world scenarios, however, pose a substantial challenge: student interaction data are inherently sparse, since each student interacts with only a limited number of skills. Confronted with sparse data, deep knowledge tracing models tend to be biased towards learning from students with large sample sizes, and learning outcomes suffer for students with insufficient interaction data. To address these challenges, this paper introduces a self-paced learning data augmentation approach that increases the number of student answer-sequence samples. In addition, the paper employs self-supervised contrastive learning to balance local and global information, mitigating the model's tendency to focus excessively on local samples. Building on these components, the paper proposes a deep knowledge tracing framework grounded in self-paced contrastive learning. Experiments on real datasets validate the self-paced contrastive learning strategy, demonstrating superior prediction accuracy compared to alternative algorithms.
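The abstract's two key ingredients, self-paced sample selection and a self-supervised contrastive objective, can be illustrated generically as follows. This is a minimal sketch of the standard forms of these techniques, not the authors' implementation; the function names `self_paced_weights` and `info_nce_loss`, the hard thresholding scheme, and the cosine-similarity InfoNCE formulation are assumptions for illustration.

```python
import numpy as np

def self_paced_weights(losses, lam):
    """Hard self-paced scheme (illustrative, not the paper's exact rule):
    a sample is admitted into training only when its current loss is below
    the pace parameter `lam`; raising `lam` over epochs moves the
    curriculum from easy samples to hard ones."""
    return (np.asarray(losses, dtype=float) < lam).astype(float)

def info_nce_loss(anchor, positive, negatives, temperature=0.5):
    """Generic InfoNCE-style contrastive loss for one anchor embedding:
    pull the positive view (e.g. an augmented answer sequence) close to
    the anchor while pushing negative samples away."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    pos = np.exp(cos(anchor, positive) / temperature)
    neg = sum(np.exp(cos(anchor, n) / temperature) for n in negatives)
    # Loss is the negative log-probability of picking the positive
    # among {positive} ∪ negatives; it is always positive.
    return float(-np.log(pos / (pos + neg)))
```

Under this sketch, the loss shrinks as the positive aligns with the anchor, and the self-paced weights grow the active training set as the pace parameter increases.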

Original language: English
Article number: 128366
Journal: Neurocomputing
Volume: 609
DOI
Publication status: Published - 7 Dec 2024
