Partitioned Memory Storage Inspired Few-Shot Class-Incremental Learning

Published: April 29, 2025 | arXiv ID: 2504.20797v1

By: Renye Zhang, Yimin Yin, Jinghua Zhang

Potential Business Impact:

Teaches computers to learn new things without forgetting.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Current mainstream deep learning techniques rely heavily on extensive training data and adapt poorly to a dynamic world, a considerable gap from human intelligence. To bridge this gap, Few-Shot Class-Incremental Learning (FSCIL) has emerged, focusing on continually learning new categories from limited samples without forgetting old knowledge. Existing FSCIL studies typically use a single model to learn knowledge across all sessions, which inevitably leads to the stability-plasticity dilemma. Unlike machines, humans store varied knowledge in different cerebral cortices. Inspired by this characteristic, our paper develops a method that learns an independent model for each session, which inherently prevents catastrophic forgetting. During the testing stage, our method integrates Uncertainty Quantification (UQ) for model deployment. Our method provides a fresh viewpoint on FSCIL and demonstrates state-of-the-art performance on the CIFAR-100 and mini-ImageNet datasets.
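The abstract does not spell out the mechanics, but the core idea (one independent model per incremental session, with an uncertainty measure deciding which model handles a test input) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the class name, the use of predictive entropy as the UQ criterion, and the per-session class offsets are all assumptions made here for concreteness.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax."""
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def entropy(probs):
    """Predictive entropy as a simple uncertainty score (assumed UQ choice)."""
    return -np.sum(probs * np.log(probs + 1e-12))

class PartitionedSessionEnsemble:
    """Sketch of partitioned-memory FSCIL inference.

    Each session trains its own model on only that session's classes,
    so old models are never updated and cannot forget. At test time,
    every session model scores the input, and the prediction from the
    least-uncertain model is returned.
    """

    def __init__(self):
        # Each entry: (model_fn, class_offset); model_fn maps an input
        # to logits over that session's own classes.
        self.session_models = []

    def add_session(self, model_fn, class_offset):
        self.session_models.append((model_fn, class_offset))

    def predict(self, x):
        best_uncertainty, best_label = None, None
        for model_fn, offset in self.session_models:
            probs = softmax(model_fn(x))
            u = entropy(probs)
            if best_uncertainty is None or u < best_uncertainty:
                best_uncertainty = u
                best_label = offset + int(np.argmax(probs))
        return best_label
```

As a toy usage example, two mock "session models" can be registered, one confident on inputs below 0.5 (classes 0-1) and one confident on inputs above (classes 2-3); the routing then picks whichever model is less uncertain.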

Page Count
13 pages

Category
Computer Science:
Artificial Intelligence