Measuring the Impact of Student Gaming Behaviors on Learner Modeling
By: Qinyi Liu, Lin Li, Valdemar Švábenský, and more
The expansion of large-scale online education platforms has made vast amounts of student interaction data available for knowledge tracing (KT). KT models estimate students' concept mastery from interaction data, but their performance is sensitive to input data quality. Gaming behaviors, such as excessive hint use, may misrepresent students' knowledge and undermine model reliability. However, systematic investigations of how different types of gaming behaviors affect KT remain scarce, and existing studies rely on costly manual analysis that does not capture behavioral diversity. In this study, we conceptualize gaming behaviors as a form of data poisoning, defined as the deliberate submission of incorrect or misleading interaction data to corrupt a model's learning process. We design Data Poisoning Attacks (DPAs) to simulate diverse gaming patterns and systematically evaluate their impact on KT model performance. Moreover, drawing on advances in DPA detection, we explore unsupervised approaches to enhance the generalizability of gaming behavior detection. We find that KT model performance tends to degrade, especially in response to random-guess behaviors. Our findings provide insights into the vulnerabilities of KT models and highlight the potential of adversarial methods for improving the robustness of learning analytics systems.
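To make the idea of simulating gaming behaviors as data poisoning concrete, the sketch below injects random-guess responses into a toy KT interaction log before model training. This is a minimal illustration, not the paper's actual attack implementation: it assumes a pandas DataFrame with `student_id`, `item_id`, and `correct` columns, and the function name and `poison_rate` / `guess_prob` parameters are hypothetical.

```python
import numpy as np
import pandas as pd

def simulate_random_guess_dpa(interactions: pd.DataFrame,
                              poison_rate: float = 0.1,
                              guess_prob: float = 0.25,
                              seed: int = 42) -> pd.DataFrame:
    """Replace a fraction of interactions with random-guess responses.

    A poisoned row keeps its student/item assignment, but its correctness
    label is re-drawn as a Bernoulli trial with probability `guess_prob`,
    mimicking a student who answers without engaging with the content.
    (Illustrative sketch; column names and parameters are assumptions.)
    """
    rng = np.random.default_rng(seed)
    poisoned = interactions.copy()
    n_poison = int(len(poisoned) * poison_rate)
    idx = rng.choice(poisoned.index.to_numpy(), size=n_poison, replace=False)
    poisoned.loc[idx, "correct"] = rng.binomial(1, guess_prob, size=n_poison)
    return poisoned

# Example: poison half of a toy interaction log, then train a KT model
# on `poisoned_log` and compare its performance against the clean log.
log = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 2],
    "item_id":    [10, 11, 12, 10, 11, 12],
    "correct":    [1, 1, 0, 0, 1, 1],
})
poisoned_log = simulate_random_guess_dpa(log, poison_rate=0.5)
print(poisoned_log)
```

Comparing a KT model's predictive accuracy on the clean versus the poisoned log is one way to quantify how sensitive the model is to this particular gaming pattern; other patterns (e.g., excessive hint use) would require different corruption rules.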
Similar Papers
Investigating the Robustness of Knowledge Tracing Models in the Presence of Student Concept Drift
Machine Learning (CS)
Helps online learning systems adapt to student changes.
Unveiling Gamer Archetypes through Multimodal Feature Correlations and Unsupervised Learning
Human-Computer Interaction
Finds four types of gamers to improve games.
Disentangled Knowledge Tracing for Alleviating Cognitive Bias
Machine Learning (CS)
Helps learning programs give better challenges.