Partitioned Memory Storage Inspired Few-Shot Class-Incremental Learning
By: Renye Zhang, Yimin Yin, Jinghua Zhang
Potential Business Impact:
Teaches computers to learn new things without forgetting.
Current mainstream deep learning techniques rely heavily on extensive training data and adapt poorly to a dynamic world, a considerable gap from human intelligence. To bridge this gap, Few-Shot Class-Incremental Learning (FSCIL) has emerged, focusing on continually learning new categories from limited samples without forgetting old knowledge. Existing FSCIL studies typically use a single model to learn knowledge across all sessions, which inevitably leads to the stability-plasticity dilemma. Unlike machines, humans store different kinds of knowledge in different cerebral cortices. Inspired by this characteristic, our paper develops a method that learns an independent model for each session, which inherently prevents catastrophic forgetting. During the testing stage, our method integrates Uncertainty Quantification (UQ) for model deployment. Our method provides a fresh viewpoint for FSCIL and demonstrates state-of-the-art performance on the CIFAR-100 and mini-ImageNet datasets.
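The abstract sketches the core idea: train an independent model per incremental session and use an uncertainty score at test time to decide which model should answer a query. Below is a minimal sketch of one plausible reading, using nearest-class-mean session models and distance-to-nearest-prototype as the uncertainty score; the class names, the routing rule, and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class SessionModel:
    """Nearest-class-mean classifier fit only on one session's classes."""
    def __init__(self, features, labels):
        self.classes = np.unique(labels)
        self.prototypes = np.stack(
            [features[labels == c].mean(axis=0) for c in self.classes]
        )

    def distances(self, x):
        # Euclidean distance from x to each class prototype of this session.
        return np.linalg.norm(self.prototypes - x, axis=1)

def predict_with_uq(session_models, x):
    """Treat distance to the nearest prototype as the uncertainty score and
    route x to the session model that is least uncertain about it."""
    best = min(session_models, key=lambda m: m.distances(x).min())
    return best.classes[int(np.argmin(best.distances(x)))]

# Toy usage: two sessions with disjoint classes and few-shot features.
rng = np.random.default_rng(0)
feat0 = rng.normal(0.0, 0.3, size=(10, 8)); y0 = np.array([0] * 5 + [1] * 5)
feat0[y0 == 1] += 2.0                       # shift class 1 away from class 0
feat1 = rng.normal(5.0, 0.3, size=(10, 8)); y1 = np.array([2] * 5 + [3] * 5)
feat1[y1 == 3] += 2.0                       # shift class 3 away from class 2
models = [SessionModel(feat0, y0), SessionModel(feat1, y1)]
print(predict_with_uq(models, feat1[7]))    # a class-3 sample -> prints 3
```

Because each session's model is frozen after its own session, adding a new session never rewrites old parameters; the open question the paper addresses is how to pick the right model at test time, which the sketch above reduces to a simple uncertainty comparison.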
Similar Papers
An experimental approach on Few Shot Class Incremental Learning
Machine Learning (CS)
Teaches computers to learn new things without forgetting.
Breaking Forgetting: Training-Free Few-Shot Class-Incremental Learning via Conditional Diffusion
CV and Pattern Recognition
Teaches computers new things without retraining.
Tripartite Weight-Space Ensemble for Few-Shot Class-Incremental Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.