No Forgetting Learning: Memory-free Continual Learning
By: Mohammad Ali Vahedifar, Qi Zhang
Potential Business Impact:
Teaches computers new things without forgetting old ones.
Continual Learning (CL) remains a central challenge in deep learning: models must acquire new knowledge sequentially while mitigating Catastrophic Forgetting (CF) of prior tasks. Existing approaches often struggle with efficiency and scalability because they require extensive memory or model buffers. This work introduces "No Forgetting Learning" (NFL), a memory-free CL framework that leverages knowledge distillation to maintain stability while preserving plasticity; memory-free means that NFL does not rely on any memory buffer. Through extensive evaluations on three benchmark datasets, we demonstrate that NFL achieves competitive performance while using approximately 14.75 times less memory than state-of-the-art methods. Furthermore, we introduce a new metric to better assess the plasticity-stability trade-off in CL.
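As a rough illustration of the distillation idea the abstract refers to (a minimal sketch, not the exact NFL objective, which is specified in the paper), the following hypothetical PyTorch snippet combines a cross-entropy loss on the current task with a distillation loss that keeps the new model's outputs close to those of a frozen copy of the previous model, so no buffer of past samples is needed:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soften both output distributions and penalize their KL divergence."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scale by T^2 so the gradient magnitude is comparable to the hard-label loss.
    return F.kl_div(log_soft_student, soft_teacher,
                    reduction="batchmean") * temperature ** 2

def training_step(model, old_model, x, y, lam=1.0):
    """One memory-free CL step (illustrative, not the paper's exact loss):
    hard loss on the new task plus distillation against the frozen old model."""
    logits = model(x)
    ce = F.cross_entropy(logits, y)       # plasticity: fit the new task
    with torch.no_grad():
        old_logits = old_model(x)         # stability: mimic prior behavior
    kd = distillation_loss(logits, old_logits)
    return ce + lam * kd
```

Because the teacher signal comes from the previous model evaluated on current-task data, no samples from earlier tasks are stored, which is what makes this style of approach memory-free.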
Similar Papers
Gradient-free Continual Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
Task-Core Memory Management and Consolidation for Long-term Continual Learning
Machine Learning (CS)
Keeps computers remembering old lessons while learning new ones.
Flashbacks to Harmonize Stability and Plasticity in Continual Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.