Enhanced Interpretable Knowledge Tracing for Students Performance Prediction with Human understandable Feature Space
By: Sein Minn, Roger Nkambou
Potential Business Impact:
Helps learning programs understand how students learn.
Knowledge Tracing (KT) plays a central role in assessing students' skill mastery and predicting their future performance. While deep learning-based KT models achieve superior predictive accuracy compared to traditional methods, their complexity and opacity hinder their ability to provide psychologically meaningful explanations. This disconnect between model parameters and cognitive theory poses challenges for understanding and enhancing the learning process, limiting the models' trustworthiness in educational applications. To address these challenges, we enhance interpretable KT models by exploring human-understandable features derived from students' interaction data. By incorporating additional features, particularly those reflecting students' learning abilities, our enhanced approach improves predictive accuracy while maintaining alignment with cognitive theory. Our contributions aim to balance predictive power with interpretability, advancing the utility of adaptive learning systems.
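To make the idea concrete, here is a minimal sketch of an interpretable, feature-augmented KT predictor in the spirit of Performance Factors Analysis (a common interpretable baseline). The function name, weights, and the per-student `ability` feature are illustrative assumptions, not the authors' actual model or fitted parameters.

```python
import math

def predict_correct(successes, failures, ability,
                    beta=-0.5, gamma=0.4, rho=-0.2, alpha=0.6):
    """Hypothetical interpretable KT sketch (PFA-style).

    Each weight is human-readable:
      beta  - skill difficulty offset
      gamma - credit for prior successes on the skill
      rho   - penalty for prior failures on the skill
      alpha - weight on a per-student learning-ability feature
              derived from interaction logs (an assumed feature)
    Weights here are illustrative, not fitted to real data.
    """
    logit = beta + gamma * successes + rho * failures + alpha * ability
    return 1.0 / (1.0 + math.exp(-logit))

# As a student accumulates successes, predicted mastery rises,
# and each term of the logit can be inspected and explained.
p_early = predict_correct(successes=1, failures=2, ability=0.5)
p_later = predict_correct(successes=5, failures=2, ability=0.5)
assert p_later > p_early
```

Because the prediction is a linear combination of countable, named features passed through a sigmoid, each parameter maps directly to a cognitive interpretation, which is the trade-off against opaque deep KT models that the abstract describes.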
Similar Papers
Does Interpretability of Knowledge Tracing Models Support Teacher Decision Making?
Machine Learning (CS)
Helps teachers teach students better and faster.
A Hierarchical Probabilistic Framework for Incremental Knowledge Tracing in Classroom Settings
Computation and Language
Helps students learn better with less data.
Disentangled Knowledge Tracing for Alleviating Cognitive Bias
Machine Learning (CS)
Helps learning programs give better challenges.