Scalable Class-Incremental Learning Based on Parametric Neural Collapse
By: Chuangxin Zhang, Guangfeng Lin, Enhui Zhao, and more
Potential Business Impact:
Teaches computers new things without forgetting old ones.
Incremental learning often encounters challenges such as overfitting to new data and catastrophic forgetting of old data. Existing methods can effectively extend the model for new tasks while freezing the parameters of the old model, but they neglect structural efficiency, leading to feature differences between modules and class misalignment under evolving class distributions. To address these issues, we propose scalable class-incremental learning based on parametric neural collapse (SCL-PNC), which enables demand-driven, minimal-cost backbone expansion via an adapt-layer and refines the static Equiangular Tight Frame (ETF) into a dynamic parametric ETF framework that grows with the incremental classes. This method can efficiently handle model expansion as the number of categories increases in real-world scenarios. Additionally, to counteract feature drift in serial expansion models, a parallel expansion framework is presented with a knowledge distillation algorithm that aligns features across expansion modules. Therefore, SCL-PNC not only provides a dynamic, extensible ETF classifier to address class misalignment caused by evolving class distributions, but also ensures feature consistency through an adapt-layer with knowledge distillation between extended modules. By leveraging neural collapse, SCL-PNC induces convergence of the incrementally expanded model through a structured combination of the expandable backbone, the adapt-layer, and the parametric ETF classifier. Experiments on standard benchmarks demonstrate the effectiveness and efficiency of our proposed method. Our code is available at https://github.com/zhangchuangxin71-cyber/dynamic_ETF2.

Keywords: Class-incremental learning; Catastrophic forgetting; Neural collapse; Knowledge distillation; Expanded model.
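To make the ETF classifier concrete: a simplex Equiangular Tight Frame for K classes is a set of K unit-norm vectors whose pairwise cosine similarity is exactly -1/(K-1), the geometry that classifier weights converge to under neural collapse. The sketch below (an illustration, not the authors' implementation; the function name and parameters are our own) constructs such a frame, which can simply be regenerated with a larger K when new classes arrive, as in a dynamic parametric ETF.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Return a (feat_dim x num_classes) simplex ETF: unit-norm columns
    with pairwise inner product -1/(num_classes - 1)."""
    assert feat_dim >= num_classes - 1, "feature dim too small for a K-simplex"
    rng = np.random.default_rng(seed)
    # Random orthonormal columns U (feat_dim x K) via QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, num_classes)))
    K = num_classes
    # Center the identity and rescale so every column has unit norm.
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M

# Expanding from 5 to 8 classes just means regenerating the frame.
M5 = simplex_etf(5, 16)
M8 = simplex_etf(8, 16)
```

Because the frame is determined by K and the feature dimension alone, extending it for newly arriving classes costs only one QR decomposition rather than any classifier retraining.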
Similar Papers
Rethinking Continual Learning with Progressive Neural Collapse
Machine Learning (CS)
Teaches computers to learn new things without forgetting old ones.
Enhancing Pre-Trained Model-Based Class-Incremental Learning through Neural Collapse
Machine Learning (CS)
Teaches computers to learn new things without forgetting.
3D-ANC: Adaptive Neural Collapse for Robust 3D Point Cloud Recognition
CV and Pattern Recognition
Makes 3D object recognition safer from tricks.