Enhancing knowledge retention for continual learning with domain-specific adapters and feature gating
By: Mohamed Abbas Hedjazi, Oussama Hadjerci, Adel Hafiane
Potential Business Impact:
Keeps computers learning new things without forgetting old ones.
Continual learning enables models to learn from a continuous stream of data while preserving previously acquired knowledge, directly addressing the challenge of catastrophic forgetting. In this study, we propose a new approach that integrates adapters within the self-attention mechanisms of Vision Transformers to enhance knowledge retention when datasets from different domains are added sequentially. Unlike previous methods that perform continual learning on a single dataset, our approach introduces domain-specific output heads and feature gating, allowing the model to maintain high accuracy on previously learned tasks while incorporating only the essential information from each new domain. The proposed method is compared against prominent parameter-efficient fine-tuning methods from the current state of the art, and the results show that it alleviates the limitations of previous work. Furthermore, we conduct a comparative analysis on three datasets, CIFAR-100, Flowers102, and DTD, each representing a distinct domain, to investigate the impact of task order on model performance. Our findings underscore the critical role of dataset sequencing in shaping learning outcomes: strategic ordering significantly improves the model's ability to adapt to evolving data distributions over time while preserving previously learned knowledge.
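To make the described architecture concrete, below is a minimal PyTorch sketch of the mechanism the abstract outlines: a bottleneck adapter attached to a self-attention block, a sigmoid feature gate that decides per channel how much adapter output to mix in, and one output head per domain. All class and parameter names (Adapter, GatedAttentionBlock, ContinualViT, the bottleneck size) are illustrative assumptions for exposition, not the authors' released code.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.act(self.down(x)))

class GatedAttentionBlock(nn.Module):
    """Self-attention wrapped with an adapter and a per-channel feature gate.

    In a continual-learning setup the attention weights would stay frozen;
    only the adapter and gate are trained on each new domain.
    """
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.adapter = Adapter(dim)
        # Sigmoid gate in [0, 1]: controls, per feature channel, how much
        # of the adapter's output is mixed into the attention features.
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(x, x, x)
        g = self.gate(attn_out)
        return attn_out + g * self.adapter(attn_out)

class ContinualViT(nn.Module):
    """Stack of gated blocks with one classification head per domain."""
    def __init__(self, dim: int = 192, depth: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(
            [GatedAttentionBlock(dim) for _ in range(depth)]
        )
        # Domain-specific output heads (class counts match the paper's datasets).
        self.heads = nn.ModuleDict({
            "cifar100":   nn.Linear(dim, 100),
            "flowers102": nn.Linear(dim, 102),
            "dtd":        nn.Linear(dim, 47),
        })

    def forward(self, tokens: torch.Tensor, domain: str) -> torch.Tensor:
        for block in self.blocks:
            tokens = block(tokens)
        pooled = tokens.mean(dim=1)          # mean-pool the token sequence
        return self.heads[domain](pooled)    # route to the domain's head

# Usage: token embeddings would come from a pretrained ViT patch embedding.
model = ContinualViT()
tokens = torch.randn(2, 197, 192)            # (batch, tokens, embed dim)
logits = model(tokens, domain="flowers102")
print(logits.shape)                          # torch.Size([2, 102])
```

Under this reading, freezing the attention weights and training only the adapter, the gate, and the current domain's head is what protects earlier tasks: the gate can suppress adapter channels that would overwrite features the previously learned heads depend on.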
Similar Papers
Class-aware Domain Knowledge Fusion and Fission for Continual Test-Time Adaptation
CV and Pattern Recognition
Helps AI learn new things without forgetting old ones.
Continual Learning for Multiple Modalities
CV and Pattern Recognition
Teaches computers to learn new things without forgetting.
Continual Learning in Vision-Language Models via Aligned Model Merging
CV and Pattern Recognition
Keeps computer memory from forgetting old lessons.