Specifying What You Know or Not for Multi-Label Class-Incremental Learning

Published: March 21, 2025 | arXiv ID: 2503.17017v1

By: Aoting Zhang, Dongbao Yang, Chang Liu, and more

Potential Business Impact:

Teaches computers to learn new categories without forgetting the ones they already know.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Existing class-incremental learning is mainly designed for single-label classification tasks and is ill-equipped for multi-label scenarios, due to the inherent contradiction of learning objectives for samples with incomplete labels. We argue that the main challenge in overcoming this contradiction in multi-label class-incremental learning (MLCIL) lies in the model's inability to clearly distinguish between known and unknown knowledge. This ambiguity hinders the model's ability to simultaneously retain historical knowledge, master current classes, and prepare for future learning. In this paper, we aim to specify what is known or not in order to accommodate Historical, Current, and Prospective knowledge for MLCIL, and propose a novel framework termed HCP. Specifically, (i) we clarify the known classes by dynamic feature purification and recall enhancement with a distribution prior, improving the precision and retention of known information; (ii) we design prospective knowledge mining to probe the unknown, preparing the model for future learning. Extensive experiments validate that our method effectively alleviates catastrophic forgetting in MLCIL, surpassing the previous state of the art by 3.3% in average accuracy on the MS-COCO B0-C10 setting without replay buffers.
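The incomplete-label contradiction the abstract describes can be made concrete with a small sketch: in MLCIL, each task only annotates its own classes, so naively treating unannotated classes as negatives would penalize labels learned in earlier tasks. A minimal illustration (this is a generic masked binary cross-entropy, not the paper's actual HCP loss; all names here are hypothetical) restricts the loss to classes whose labels are known in the current task:

```python
import numpy as np

def masked_bce(logits, labels, known_mask):
    """Binary cross-entropy averaged only over classes marked as
    known in `known_mask`. Unknown classes contribute nothing,
    since their absent labels are not true negatives."""
    probs = 1.0 / (1.0 + np.exp(-logits))  # per-class sigmoid
    eps = 1e-12
    per_class = -(labels * np.log(probs + eps)
                  + (1 - labels) * np.log(1 - probs + eps))
    masked = per_class * known_mask
    return masked.sum() / known_mask.sum()

# Three classes; only the first two are annotated in the current task.
logits = np.array([2.0, -1.0, 0.5])
labels = np.array([1.0, 0.0, 0.0])   # the trailing 0 is "unknown", not negative
known = np.array([1.0, 1.0, 0.0])

loss_partial = masked_bce(logits, labels, known)      # ignores the third class
loss_naive = masked_bce(logits, labels, np.ones(3))   # wrongly treats it as negative
```

With the mask, the unannotated third class no longer pushes the model toward "absent", which is one way to see why distinguishing known from unknown classes matters for retaining historical knowledge.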

Country of Origin
🇨🇳 China

Page Count
9 pages

Category
Computer Science:
Machine Learning (CS)