
Forget Less by Learning Together through Concept Consolidation

Published: January 5, 2026 | arXiv ID: 2601.01963v1

By: Arjun Ramesh Kaushik, Naresh Kumar Devulapally, Vishnu Suresh Lokhande, and others

Potential Business Impact:

Teaches AI new things without forgetting old ones.

Business Areas:
Image Recognition Data and Analytics, Software

Custom Diffusion Models (CDMs) have gained significant attention for their remarkable ability to personalize generative processes. However, existing CDMs suffer from catastrophic forgetting when continuously learning new concepts. Most prior work attempts to mitigate this issue in a sequential learning setting with a fixed order of concept inflow, neglecting inter-concept interactions. In this paper, we propose a novel framework, Forget Less by Learning Together (FL2T), that enables concurrent and order-agnostic concept learning while addressing catastrophic forgetting. Specifically, we introduce a set-invariant inter-concept learning module in which proxies guide feature selection across concepts, facilitating improved knowledge retention and transfer. By leveraging inter-concept guidance, our approach preserves old concepts while efficiently incorporating new ones. Extensive experiments across three datasets demonstrate that our method significantly improves concept retention and mitigates catastrophic forgetting, highlighting the effectiveness of inter-concept catalytic behavior in incremental concept learning over ten tasks, with a gain of at least 2% in average CLIP image-alignment scores.
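The abstract only names the mechanism, so here is a minimal PyTorch sketch of what a proxy-guided, set-invariant selection module could look like. The class name `ProxyGuidedConceptModule`, the linear query/key projections, and the attention-then-mean pooling are all our assumptions for illustration, not the paper's actual FL2T implementation.

```python
import torch
import torch.nn as nn


class ProxyGuidedConceptModule(nn.Module):
    """Hypothetical sketch: learnable per-concept proxies attend over the
    whole concept set to select features (one reading of 'set-invariant
    inter-concept learning'). Not the authors' code."""

    def __init__(self, num_concepts: int, feat_dim: int):
        super().__init__()
        # One learnable proxy vector per concept (an assumption).
        self.proxies = nn.Parameter(torch.randn(num_concepts, feat_dim))
        self.to_query = nn.Linear(feat_dim, feat_dim)
        self.to_key = nn.Linear(feat_dim, feat_dim)

    def forward(self, concept_feats: torch.Tensor) -> torch.Tensor:
        # concept_feats: (num_concepts, tokens, feat_dim), one row per concept.
        q = self.to_query(self.proxies)      # (C, D)
        k = self.to_key(concept_feats)       # (C, T, D)
        # Each proxy attends over every concept's tokens: inter-concept guidance.
        attn = torch.einsum("cd,ktd->ckt", q, k) / q.shape[-1] ** 0.5
        attn = attn.softmax(dim=-1)          # (C_proxy, C_concept, T)
        selected = torch.einsum("ckt,ktd->ckd", attn, concept_feats)
        # Mean over the concept axis makes each proxy's output invariant to
        # the order in which concepts arrive (permutation-invariant pooling).
        return selected.mean(dim=1)          # (C, D)


# Usage: four concepts, 16 feature tokens each; output is order-agnostic.
module = ProxyGuidedConceptModule(num_concepts=4, feat_dim=64)
guided = module(torch.randn(4, 16, 64))     # -> shape (4, 64)
```

The mean pooling is the simplest permutation-invariant aggregator; an actual implementation might use a richer set function, but any choice at that position would preserve the order-agnostic property the abstract emphasizes.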

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
17 pages

Category
Computer Science:
Computer Vision and Pattern Recognition