Forget Less by Learning Together through Concept Consolidation
By: Arjun Ramesh Kaushik, Naresh Kumar Devulapally, Vishnu Suresh Lokhande, and more
Potential Business Impact:
Teaches AI new things without forgetting old ones.
Custom Diffusion Models (CDMs) have gained significant attention due to their remarkable ability to personalize generative processes. However, existing CDMs suffer from catastrophic forgetting when continuously learning new concepts. Most prior works attempt to mitigate this issue under a sequential learning setting with a fixed order of concept inflow, neglecting inter-concept interactions. In this paper, we propose Forget Less by Learning Together (FL2T), a novel framework that enables concurrent and order-agnostic concept learning while addressing catastrophic forgetting. Specifically, we introduce a set-invariant inter-concept learning module in which proxies guide feature selection across concepts, facilitating improved knowledge retention and transfer. By leveraging inter-concept guidance, our approach preserves old concepts while efficiently incorporating new ones. Extensive experiments across three datasets demonstrate that our method significantly improves concept retention and mitigates catastrophic forgetting, highlighting the effectiveness of inter-concept catalytic behavior: on incremental concept learning over ten tasks, it achieves at least a 2% gain in average CLIP Image Alignment score.
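The abstract's core mechanism, proxies guiding feature selection across concepts in a set-invariant way, can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name InterConceptProxyModule, the cross-attention-plus-gating design, and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

class InterConceptProxyModule(nn.Module):
    """Minimal sketch: learnable proxies guide feature selection across concepts.

    Each concept's features cross-attend to a shared set of learnable proxy
    vectors; since attention pools over the proxy set, each concept's output
    is independent of the order in which concepts are presented
    (order-agnostic over the concept axis). Names and sizes are hypothetical.
    """

    def __init__(self, dim: int, num_proxies: int, num_heads: int = 4):
        super().__init__()
        # Learnable proxies living in the same space as concept features.
        self.proxies = nn.Parameter(torch.randn(num_proxies, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Soft feature-selection gate driven by the proxy guidance.
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, concept_feats: torch.Tensor) -> torch.Tensor:
        # concept_feats: (batch, num_concepts, dim), concepts in any order.
        batch = concept_feats.size(0)
        proxies = self.proxies.unsqueeze(0).expand(batch, -1, -1)
        # Each concept queries the proxy set for guidance.
        guidance, _ = self.attn(concept_feats, proxies, proxies)
        # Gate (softly select) each concept's features using that guidance.
        return concept_feats * self.gate(guidance)

# Usage: ten concepts arriving in an arbitrary order, 768-d features.
module = InterConceptProxyModule(dim=768, num_proxies=16)
feats = torch.randn(2, 10, 768)
selected = module(feats)  # (2, 10, 768), per-concept output is order-agnostic
```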
Similar Papers
Forget Less by Learning from Parents Through Hierarchical Relationships
CV and Pattern Recognition
Teaches AI new things without forgetting old ones.
Sculpting Memory: Multi-Concept Forgetting in Diffusion Models via Dynamic Mask and Concept-Aware Optimization
CV and Pattern Recognition
Removes unwanted concepts from AI image generators.
ConceptGuard: Continual Personalized Text-to-Image Generation with Forgetting and Confusion Mitigation
CV and Pattern Recognition
Teaches AI to learn new things without forgetting old ones.