An Experimental Approach to Few-Shot Class-Incremental Learning
By: Marinela Adam
Potential Business Impact:
Teaches computers to learn new things without forgetting.
Few-Shot Class-Incremental Learning (FSCIL) is a cutting-edge paradigm within the broader scope of machine learning, designed to empower models to assimilate new classes of data from limited examples while safeguarding existing knowledge. This paper presents several solutions, with extensive experiments across large-scale datasets, domain shifts, and network architectures, to evaluate and compare the selected methods. We highlight their advantages and then present an experimental approach aimed at improving the most promising one by replacing its visual-language (V-L) model, CLIP, with another V-L model, CLOOB, which appears to outperform it on zero-shot learning tasks. The aim of this report is to present an experimental method for FSCIL that improves its performance. We also offer an overview and analysis of recent advancements in the FSCIL domain, focusing on various strategies to mitigate catastrophic forgetting and improve the adaptability of models to evolving tasks and datasets.
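The zero-shot classification mechanism that both CLIP and CLOOB share can be sketched in a few lines: an image embedding is compared against text-prompt embeddings for each candidate class by cosine similarity, and the most similar prompt wins. The sketch below uses random tensors in place of real encoder outputs (the embedding dimension, class count, and tensor names are illustrative assumptions, not the paper's actual setup):

```python
import torch
import torch.nn.functional as F

# Hypothetical pre-computed embeddings. In practice these would come from
# a V-L encoder pair such as CLIP's or CLOOB's image and text towers;
# here random vectors stand in so the sketch is self-contained.
torch.manual_seed(0)
image_embedding = torch.randn(1, 512)        # one query image
class_text_embeddings = torch.randn(5, 512)  # prompts for five candidate classes

# Zero-shot classification: L2-normalize both sides, take cosine
# similarities via a matrix product, and pick the best-matching class.
image_norm = F.normalize(image_embedding, dim=-1)
text_norm = F.normalize(class_text_embeddings, dim=-1)
similarities = image_norm @ text_norm.T      # shape (1, 5)
predicted_class = similarities.argmax(dim=-1).item()
```

Because no class-specific training is needed, swapping CLIP for CLOOB in a pipeline built on this mechanism amounts to swapping the encoders that produce the embeddings; the comparison step itself is unchanged.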
Similar Papers
Partitioned Memory Storage Inspired Few-Shot Class-Incremental learning
Artificial Intelligence
Teaches computers to learn new things without forgetting.
Tripartite Weight-Space Ensemble for Few-Shot Class-Incremental Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
Automatic Attack Discovery for Few-Shot Class-Incremental Learning via Large Language Models
Machine Learning (CS)
Makes AI forget lessons when learning new things.