Semi-parametric Memory Consolidation: Towards Brain-like Deep Continual Learning
By: Geng Liu, Fei Zhu, Rong Feng, and more
Potential Business Impact:
Computers remember old lessons while learning new ones.
Humans and most animals inherently possess a distinctive capacity to continually acquire novel experiences and accumulate worldly knowledge over time. This ability, termed continual learning, is also critical for deep neural networks (DNNs) to adapt to the dynamically evolving world in open environments. However, DNNs notoriously suffer from catastrophic forgetting of previously learned knowledge when trained on sequential tasks. In this work, inspired by the interactive memory and learning system of the human brain, we propose a novel biomimetic continual learning framework that integrates semi-parametric memory with a wake-sleep consolidation mechanism. For the first time, our method enables deep neural networks to achieve high performance on novel tasks while retaining prior knowledge in challenging real-world continual learning scenarios, e.g., class-incremental learning on ImageNet. This study demonstrates that emulating biological intelligence offers a promising path toward endowing deep neural networks with continual learning capabilities.
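At a high level, the abstract describes a parametric network coupled with a semi-parametric memory and a wake-sleep consolidation loop. The PyTorch sketch below is one minimal, illustrative reading of that idea, assuming the memory stores a handful of raw exemplars per class, the wake phase trains the encoder on new-task data, and the sleep phase replays the memory to consolidate old classes. The class name SemiParametricLearner, the prototypical loss, and all hyperparameters are assumptions for illustration, not the authors' actual method.

```python
# Minimal, illustrative sketch (not the paper's implementation): the parametric
# part is a small encoder, the non-parametric part is a per-class exemplar
# memory, "wake" learns the current task, and "sleep" replays the memory to
# consolidate previously seen classes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SemiParametricLearner:
    def __init__(self, input_dim=784, feature_dim=64, exemplars_per_class=20, lr=1e-3):
        # Parametric memory: a trainable feature extractor.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(), nn.Linear(256, feature_dim)
        )
        self.optimizer = torch.optim.Adam(self.encoder.parameters(), lr=lr)
        # Non-parametric memory: a few raw exemplars stored per class.
        self.memory = {}  # class id -> tensor of exemplar inputs
        self.k = exemplars_per_class

    def _prototypical_loss(self, x, y):
        # Cross-entropy over negative distances to within-batch class means.
        feats = self.encoder(x)
        classes = y.unique()
        protos = torch.stack([feats[y == c].mean(dim=0) for c in classes])
        idx = {c.item(): i for i, c in enumerate(classes)}
        targets = torch.tensor([idx[c.item()] for c in y])
        return F.cross_entropy(-torch.cdist(feats, protos), targets)

    def wake_phase(self, x, y, steps=5):
        # Learn the new task, then write a few exemplars per class to memory.
        for _ in range(steps):
            loss = self._prototypical_loss(x, y)
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()
        for c in y.unique().tolist():
            self.memory[c] = x[y == c][: self.k].detach().clone()

    def sleep_phase(self, steps=5):
        # Consolidate offline by replaying the exemplar memory (old classes).
        if len(self.memory) < 2:
            return
        xs = torch.cat(list(self.memory.values()))
        ys = torch.cat([torch.full((v.shape[0],), c) for c, v in self.memory.items()])
        for _ in range(steps):
            loss = self._prototypical_loss(xs, ys)
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()

    @torch.no_grad()
    def predict(self, x):
        # Nearest-prototype classification against the non-parametric memory.
        classes = sorted(self.memory)
        protos = torch.stack([self.encoder(self.memory[c]).mean(dim=0) for c in classes])
        return torch.tensor(classes)[torch.cdist(self.encoder(x), protos).argmin(dim=1)]
```

A continual run would alternate wake_phase over successive task splits with sleep_phase between them, then evaluate with predict over all classes seen so far; the paper's actual memory design and consolidation mechanism are necessarily richer, since they scale to class-incremental learning on ImageNet.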
Similar Papers
Hybrid Learners Do Not Forget: A Brain-Inspired Neuro-Symbolic Approach to Continual Learning
Machine Learning (CS)
AI learns new things without forgetting old ones.
Rethinking Continual Learning with Progressive Neural Collapse
Machine Learning (CS)
Teaches computers to learn new things without forgetting old ones.
On the Theory of Continual Learning with Gradient Descent for Neural Networks
Machine Learning (Stat)
Helps AI remember old lessons while learning new ones.