Human-like fleeting memory improves language learning but impairs reading time prediction in transformer language models
By: Abishek Thamma, Micha Heilbron
Potential Business Impact:
Helps computers learn language better, but not predict human reading times.
Human memory is fleeting. As words are processed, the exact wordforms that make up incoming sentences are rapidly lost. Cognitive scientists have long believed that this limitation of memory may, paradoxically, help in learning language, an idea supported by classic connectionist modelling work. The rise of Transformers appears to challenge this idea, as these models learn language effectively despite lacking memory limitations or other architectural recency biases. Here, we investigate the hypothesized benefit of fleeting memory for language learning in tightly controlled experiments on transformer language models. Training transformers with and without fleeting memory on a developmentally realistic training set, we find that fleeting memory consistently improves language learning (as quantified by both overall language modelling performance and targeted syntactic evaluation) but, unexpectedly, impairs surprisal-based prediction of human reading times. Interestingly, follow-up analyses revealed that this discrepancy (better language modelling, yet worse reading-time prediction) could not be accounted for by prior explanations of why better language models sometimes fit human reading times worse. Together, these results support a benefit of memory limitations for neural network language learning, but not for predicting human reading behavior.
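The reading-time analyses rely on surprisal theory: a word's processing difficulty scales with its surprisal, the negative log probability the language model assigns to it in context. A minimal sketch of that quantity (the toy probabilities below are illustrative, not drawn from the paper's models or data):

```python
import math

def surprisal_bits(prob: float) -> float:
    """Surprisal of a word given its context, in bits:
    -log2 p(word | context). Under surprisal theory, higher
    surprisal predicts longer human reading times."""
    if not 0.0 < prob <= 1.0:
        raise ValueError("prob must be in (0, 1]")
    return -math.log2(prob)

# Toy probabilities a model might assign to the next word:
print(surprisal_bits(0.5))    # 1.0 bit: a fairly predictable word
print(surprisal_bits(0.001))  # ~9.97 bits: a surprising word, read more slowly
```

In practice these per-word surprisals, taken from the trained models, are regressed against measured reading times; the paper's finding is that the fleeting-memory models score better on language modelling yet yield worse fits in this regression.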
Similar Papers
Surprisal from Larger Transformer-based Language Models Predicts fMRI Data More Poorly
Computation and Language
Brain scans show how well computers understand words.
Memory Limitations of Prompt Tuning in Transformers
Machine Learning (CS)
Computers forget things when given too much information.
Modeling cognitive processes of natural reading with transformer-based Language Models
Computation and Language
Helps computers understand how people read.