Forget Less by Learning from Parents Through Hierarchical Relationships

Published: January 5, 2026 | arXiv ID: 2601.01892v1

By: Arjun Ramesh Kaushik, Naresh Kumar Devulapally, Vishnu Suresh Lokhande, and more

Potential Business Impact:

Teaches AI new things without forgetting old ones.

Business Areas:
Parenting Community and Lifestyle

Custom Diffusion Models (CDMs) offer impressive capabilities for personalization in generative modeling, yet they remain vulnerable to catastrophic forgetting when learning new concepts sequentially. Existing approaches primarily focus on minimizing interference between concepts, often neglecting the potential for positive inter-concept interactions. In this work, we present Forget Less by Learning from Parents (FLLP), a novel framework that introduces a parent-child inter-concept learning mechanism in hyperbolic space to mitigate forgetting. By embedding concept representations within a Lorentzian manifold, naturally suited to modeling tree-like hierarchies, we define parent-child relationships in which previously learned concepts serve as guidance for adapting to new ones. Our method not only preserves prior knowledge but also supports continual integration of new concepts. We validate FLLP on three public datasets and one synthetic benchmark, showing consistent improvements in both robustness and generalization.
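
The abstract's central ingredient is embedding concept representations on a Lorentzian (hyperboloid) manifold, where geodesic distance naturally encodes tree-like parent-child structure. The sketch below illustrates that geometry only: the Lorentzian inner product, the lift onto the hyperboloid, the geodesic distance, and a hypothetical margin-based penalty that keeps a new "child" concept near its previously learned "parent". Function names, the margin term, and the toy vectors are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def to_hyperboloid(v):
    # Lift a Euclidean vector v onto the Lorentz model by choosing x0 > 0
    # so that <x, x>_L = -1 holds.
    x0 = np.sqrt(1.0 + np.sum(v * v, axis=-1, keepdims=True))
    return np.concatenate([x0, v], axis=-1)

def lorentz_distance(x, y, eps=1e-7):
    # Geodesic distance on the hyperboloid: arccosh(-<x, y>_L)
    inner = np.clip(-lorentz_inner(x, y), 1.0 + eps, None)
    return np.arccosh(inner)

def parent_child_penalty(child_emb, parent_emb, margin=0.1):
    # Hypothetical guidance term (assumption, not from the paper): penalize
    # a child concept embedding that drifts farther than `margin` from its
    # previously learned parent concept.
    return np.maximum(lorentz_distance(child_emb, parent_emb) - margin, 0.0)

# Toy usage: a "parent" concept already learned and a "child" being adapted.
parent = to_hyperboloid(np.array([0.3, -0.1, 0.5]))
child = to_hyperboloid(np.array([0.4, -0.2, 0.7]))
print(lorentz_distance(parent, child), parent_child_penalty(child, parent))
```

In the Lorentz model, distances grow roughly exponentially with depth in a hierarchy, which is why such a penalty can express "child stays close to parent" much more compactly than a Euclidean regularizer; how FLLP actually couples this geometry to diffusion-model fine-tuning is described in the paper itself.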

Country of Origin
🇺🇸 United States

Page Count
19 pages

Category
Computer Science:
Computer Vision and Pattern Recognition