Forget Less by Learning from Parents Through Hierarchical Relationships
By: Arjun Ramesh Kaushik, Naresh Kumar Devulapally, Vishnu Suresh Lokhande, et al.
Potential Business Impact:
Teaches AI new things without forgetting old ones.
Custom Diffusion Models (CDMs) offer impressive capabilities for personalization in generative modeling, yet they remain vulnerable to catastrophic forgetting when learning new concepts sequentially. Existing approaches primarily focus on minimizing interference between concepts, often neglecting the potential for positive inter-concept interactions. In this work, we present Forget Less by Learning from Parents (FLLP), a novel framework that introduces a parent-child inter-concept learning mechanism in hyperbolic space to mitigate forgetting. By embedding concept representations within a Lorentzian manifold, naturally suited to modeling tree-like hierarchies, we define parent-child relationships in which previously learned concepts serve as guidance for adapting to new ones. Our method not only preserves prior knowledge but also supports continual integration of new concepts. We validate FLLP on three public datasets and one synthetic benchmark, showing consistent improvements in both robustness and generalization.
Similar Papers
Forget Less by Learning Together through Concept Consolidation
CV and Pattern Recognition
Teaches AI new things without forgetting old ones.
Mass Concept Erasure in Diffusion Models with Concept Hierarchy
CV and Pattern Recognition
Stops AI from making bad pictures of many things.
PrivacyCD: Hierarchical Unlearning for Protecting Student Privacy in Cognitive Diagnosis
Machine Learning (CS)
Lets AI forget specific student learning data.