Explaining Robustness to Catastrophic Forgetting Through Incremental Concept Formation
By: Nicki Barari, Edward Kim, Christopher MacLellan
Potential Business Impact:
Keeps computers learning new things without forgetting old ones.
Catastrophic forgetting remains a central challenge in continual learning, where models are required to integrate new knowledge over time without losing what they have previously learned. In prior work, we introduced Cobweb/4V, a hierarchical concept formation model that exhibited robustness to catastrophic forgetting in visual domains. Motivated by this robustness, we examine three hypotheses regarding the factors that contribute to such stability: (1) adaptive structural reorganization enhances knowledge retention, (2) sparse and selective updates reduce interference, and (3) information-theoretic learning based on sufficient statistics provides advantages over gradient-based backpropagation. To test these hypotheses, we compare Cobweb/4V with neural baselines, including CobwebNN, a neural implementation of the Cobweb framework that we introduce in this work. Experiments on datasets of varying complexity (MNIST, Fashion-MNIST, MedMNIST, and CIFAR-10) show that adaptive restructuring enhances learning plasticity, sparse updates help mitigate interference, and the information-theoretic learning process preserves prior knowledge without revisiting past data. Together, these findings provide insight into mechanisms that can mitigate catastrophic forgetting and highlight the potential of concept-based, information-theoretic approaches for building stable and adaptive continual learning systems.
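To make the three hypothesized mechanisms concrete, the Python sketch below shows a toy incremental concept-formation tree in the general spirit of Cobweb: each node stores count statistics (no gradients), a new instance updates only the nodes along a single root-to-leaf path (sparse, selective updates), and leaves are split as data arrives (adaptive restructuring). This is a minimal illustration, not the authors' Cobweb/4V or CobwebNN implementation; the attribute-value instance format, the simplified category-utility-style score, and the fringe-split rule are illustrative assumptions.

```python
from collections import defaultdict


class ConceptNode:
    """One concept in the hierarchy; stores count statistics, not learned weights."""

    def __init__(self):
        self.count = 0                                          # instances summarized here
        self.av_counts = defaultdict(lambda: defaultdict(int))  # attribute -> value -> count
        self.children = []

    def update_stats(self, instance):
        """Fold one instance into the node's counts (no gradient step)."""
        self.count += 1
        for attr, val in instance.items():
            self.av_counts[attr][val] += 1

    def copy_stats(self):
        """Clone the node's statistics (used for splits and look-ahead scoring)."""
        clone = ConceptNode()
        clone.count = self.count
        for attr, vals in self.av_counts.items():
            clone.av_counts[attr] = defaultdict(int, vals)
        return clone

    def score(self):
        """Simplified category-utility-style score: sum of squared value probabilities."""
        if self.count == 0:
            return 0.0
        return sum((cnt / self.count) ** 2
                   for vals in self.av_counts.values()
                   for cnt in vals.values())

    def incorporate(self, instance):
        """Sparse update: only the nodes on one root-to-leaf path absorb the instance."""
        if not self.children:
            if self.count == 0:
                self.update_stats(instance)
                return
            # Adaptive restructuring (simplified fringe split): keep the old leaf's
            # summary in one child and start a fresh concept for the new instance.
            new_leaf = ConceptNode()
            new_leaf.update_stats(instance)
            self.children = [self.copy_stats(), new_leaf]
            self.update_stats(instance)
            return
        self.update_stats(instance)
        # Route to the child whose statistics the instance fits best; all other
        # children are untouched, which limits interference with older concepts.
        best = max(self.children, key=lambda c: self._score_if_added(c, instance))
        best.incorporate(instance)

    @staticmethod
    def _score_if_added(child, instance):
        """Peek at the score a child would have after absorbing the instance."""
        probe = child.copy_stats()
        probe.update_stats(instance)
        return probe.score()


if __name__ == "__main__":
    root = ConceptNode()
    stream = [{"shape": "round", "color": "red"},
              {"shape": "round", "color": "green"},
              {"shape": "square", "color": "blue"},
              {"shape": "square", "color": "red"}]
    for x in stream:  # instances arrive one at a time, as in continual learning
        root.incorporate(x)
    print("root summarizes", root.count, "instances;",
          len(root.children), "top-level concepts")
```

Because each instance touches only one path of count statistics, concepts elsewhere in the tree are never overwritten; in this sketch that is the source of the stability the abstract attributes to sparse, information-theoretic updates.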
Similar Papers
Sequencing to Mitigate Catastrophic Forgetting in Continual Learning
Machine Learning (CS)
Teaches computers to learn new things without forgetting old ones.
Catastrophic Forgetting in Kolmogorov-Arnold Networks
Machine Learning (CS)
New AI learns without forgetting old lessons.
Mitigating Catastrophic Forgetting in Continual Learning through Model Growth
Computation and Language
Keeps AI smart when learning new things.