Score: 1

FlatFormer: A Flat Transformer Knowledge Tracing Model Based on Cognitive Bias Injection

Published: December 7, 2025 | arXiv ID: 2512.06629v1

By: Xiao-li Xia, Hou-biao Li

Potential Business Impact:

Lets learning platforms track what students know in real time, using far less computing power than existing models.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Knowledge Tracing (KT) models face a critical "Performance-Complexity Trap": capturing complex cognitive dynamics such as learning sessions and memory decay typically requires deep hierarchical architectures, which incur prohibitive computational costs for real-time deployment. To resolve this, we propose FlatFormer, a streamlined architecture built on the design paradigm of "Information Injection over Structural Stacking." Unlike parameter-heavy hierarchical models, FlatFormer uses a standard flat Transformer augmented with two lightweight injection mechanisms: (i) a hybrid input encoding that combines learnable session identifiers with fixed sinusoidal step embeddings; and (ii) a pre-computed power-law bias added directly to the attention logits to explicitly model the forgetting curve. Extensive experiments on four large-scale datasets (e.g., EdNet, Junyi) show that FlatFormer achieves state-of-the-art performance. On EdNet, it improves absolute AUC by 8.3% over the strongest hierarchical baseline (HiTSKT) while using fewer than 15% of its parameters and running inference roughly three times faster. These results validate that high cognitive fidelity does not necessitate architectural complexity.
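
Both injection mechanisms are simple enough to prototype directly. Below is a minimal PyTorch sketch, assuming the standard sinusoidal positional encoding for step embeddings and a log-form power-law bias over query-key time lags; the class and function names, the `decay` parameter, and the exact bias formula are illustrative assumptions, not the paper's exact implementation.

```python
import math
import torch
import torch.nn as nn

def sinusoidal_step_embeddings(max_len: int, d_model: int) -> torch.Tensor:
    """Fixed (non-trainable) sinusoidal embeddings over within-session step
    positions, i.e., the standard Transformer positional encoding.
    Assumes d_model is even."""
    pos = torch.arange(max_len, dtype=torch.float).unsqueeze(1)           # (max_len, 1)
    div = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float)
                    * (-math.log(10000.0) / d_model))                     # (d_model/2,)
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe

class HybridInputEncoding(nn.Module):
    """Sketch of mechanism (i): learnable session-ID embeddings plus fixed
    sinusoidal step embeddings, summed onto the interaction embeddings."""
    def __init__(self, num_items: int, num_sessions: int, d_model: int,
                 max_steps: int = 512):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, d_model)
        self.session_emb = nn.Embedding(num_sessions, d_model)  # learnable session IDs
        self.register_buffer("step_emb",
                             sinusoidal_step_embeddings(max_steps, d_model))

    def forward(self, item_ids, session_ids, step_ids):
        # All inputs: (batch, seq_len) integer tensors; step_ids index each
        # interaction's position within its session.
        return (self.item_emb(item_ids)
                + self.session_emb(session_ids)
                + self.step_emb[step_ids])

def power_law_attention_bias(seq_len: int, decay: float = 0.5) -> torch.Tensor:
    """Sketch of mechanism (ii): a pre-computed power-law bias over query-key
    time lags, added to the attention logits before softmax so that older
    interactions are down-weighted, mimicking a forgetting curve."""
    pos = torch.arange(seq_len)
    lag = (pos.unsqueeze(1) - pos.unsqueeze(0)).clamp(min=0).float()      # lag[i, j] = i - j
    bias = -decay * torch.log1p(lag)  # exp(bias) = (1 + lag)^(-decay): a power law
    causal = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))   # no peeking ahead
    return bias.masked_fill(~causal, float("-inf"))

# Usage inside attention (hypothetical): the bias is computed once and reused,
# so the forgetting behavior adds no per-step cost at inference time.
#   logits = (Q @ K.transpose(-2, -1)) / math.sqrt(d_k)
#   logits = logits + power_law_attention_bias(seq_len)
```

Because the bias depends only on positions, not on learned state, it can be cached for a fixed maximum sequence length, which is consistent with the paper's claim that cognitive modeling is injected without structural stacking.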

Country of Origin
🇨🇳 China

Page Count
36 pages

Category
Computer Science:
Artificial Intelligence