Inclusive Training Separation and Implicit Knowledge Interaction for Balanced Online Class-Incremental Learning

Published: April 29, 2025 | arXiv ID: 2504.20566v1

By: Shunjie Wen, Thomas Heinis, Dong-Wan Choi

Potential Business Impact:

Teaches computers to learn new things without forgetting old ones.

Business Areas:
MOOC Education, Software

Online class-incremental learning (OCIL) focuses on gradually learning new classes (called plasticity) from a data stream in a single pass, while concurrently preserving knowledge of previously learned classes (called stability). The primary challenge in OCIL lies in maintaining a good balance between the knowledge of old and new classes within the continually updated model. Most existing methods rely on explicit knowledge interaction through experience replay, and often employ exclusive training separation to address bias problems. Nevertheless, achieving a well-balanced learner remains challenging, as these methods often exhibit either reduced plasticity or limited stability due to difficulties in continually integrating knowledge in the OCIL setting. In this paper, we propose a novel replay-based method, called Balanced Online Incremental Learning (BOIL), which can achieve both high plasticity and stability, thus ensuring more balanced performance in OCIL. BOIL employs an inclusive training separation strategy using dual classifiers, so that knowledge from both old and new classes can be effectively integrated into the model, while introducing implicit approaches for transferring knowledge across the two classifiers. Extensive experimental evaluations over three widely-used OCIL benchmark datasets demonstrate the superiority of BOIL, showing more balanced yet better performance compared to state-of-the-art replay-based OCIL methods.
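The experience replay the abstract refers to typically maintains a small fixed-size memory of past samples that is revisited while learning new classes. A minimal sketch of such a buffer using reservoir sampling — a common choice in replay-based OCIL, though the paper's exact buffer policy is not specified here and this is purely illustrative:

```python
import random

class ReplayBuffer:
    """Fixed-size memory for experience replay in a single-pass stream.

    Uses reservoir sampling so every sample seen so far has equal
    probability of being retained. This is an illustrative sketch, not
    the buffer policy used by BOIL.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []          # stored (x, y) pairs
        self.seen = 0           # total stream samples observed
        self.rng = random.Random(seed)

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Replace a stored sample with probability capacity/seen,
            # which keeps the retention probability uniform over the stream.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, batch_size):
        # Draw a replay batch of old-class samples to mix with new data.
        k = min(batch_size, len(self.data))
        return self.rng.sample(self.data, k)
```

During training, each incoming batch of new-class data would be combined with a batch drawn from this buffer, giving the model continued exposure to old classes while it learns new ones.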

Country of Origin
🇬🇧 United Kingdom; 🇰🇷 Korea, Republic of

Page Count
12 pages

Category
Computer Science:
Machine Learning (CS)