Score: 1

No Forgetting Learning: Memory-free Continual Learning

Published: March 6, 2025 | arXiv ID: 2503.04638v2

By: Mohammad Ali Vahedifar, Qi Zhang

Potential Business Impact:

Teaches computers new things without forgetting old ones.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Continual Learning (CL) remains a central challenge in deep learning: models must sequentially acquire new knowledge while mitigating Catastrophic Forgetting (CF) of prior tasks. Existing approaches often struggle with efficiency and scalability, requiring extensive memory or model buffers. This work introduces "No Forgetting Learning" (NFL), a memory-free CL framework that leverages knowledge distillation to maintain stability while preserving plasticity; memory-free here means that NFL does not rely on any memory buffer of past data. Through extensive evaluations on three benchmark datasets, we demonstrate that NFL achieves competitive performance while using approximately 14.75 times less memory than state-of-the-art methods. Furthermore, we introduce a new metric to better assess the plasticity-stability trade-off in CL.
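The summary does not spell out NFL's exact loss, but the core idea it names, distillation-based, buffer-free continual learning, follows a well-known pattern (in the spirit of Learning without Forgetting): before training on a new task, snapshot the current model as a frozen teacher, then combine cross-entropy on the new task (plasticity) with a distillation penalty that keeps the student's outputs on old classes close to the teacher's (stability). The sketch below illustrates that pattern under those assumptions; the function names, the temperature, and the `alpha` weighting are illustrative, not the paper's actual formulation.

```python
# Hedged sketch of memory-free continual learning via knowledge distillation,
# LwF-style. NFL's actual loss is not specified in this summary; the
# distillation term and hyperparameters below are assumptions.
import copy
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soften both output distributions and penalize their KL divergence."""
    t = temperature
    p_teacher = F.softmax(teacher_logits / t, dim=1)
    log_p_student = F.log_softmax(student_logits / t, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)

def train_new_task(model, new_task_loader, optimizer, alpha=1.0, epochs=1):
    """Train on a new task with no stored exemplars: a frozen snapshot of
    the model (teacher) anchors predictions on old classes, while
    cross-entropy drives learning on the new task."""
    teacher = copy.deepcopy(model).eval()  # snapshot taken before the new task
    for p in teacher.parameters():
        p.requires_grad_(False)

    for _ in range(epochs):
        for x, y in new_task_loader:
            student_logits = model(x)
            with torch.no_grad():
                teacher_logits = teacher(x)  # old model's view of new data

            ce = F.cross_entropy(student_logits, y)  # plasticity term
            # Distill only over the old-class outputs (stability term);
            # if the head grew for new classes, slice to the teacher's size.
            kd = distillation_loss(
                student_logits[:, : teacher_logits.size(1)], teacher_logits
            )
            loss = ce + alpha * kd

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```

Because the teacher is only a temporary copy of the model itself and no past examples are replayed, the memory cost stays constant in the number of tasks, which is the kind of saving the reported ~14.75x memory reduction refers to.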

Page Count
12 pages

Category
Computer Science:
Machine Learning (CS)