Large-Small Model Collaborative Framework for Federated Continual Learning
By: Hao Yu, Xin Yang, Boyang Fan, and more
Potential Business Impact:
Helps big AI learn new things without forgetting.
Continual learning (CL) for Foundation Models (FMs) is an essential yet underexplored challenge, especially in Federated Continual Learning (FCL), where each client learns from a private, evolving task stream under strict data and communication constraints. Despite their powerful generalization abilities, FMs often exhibit suboptimal performance on local downstream tasks because they cannot utilize private local data. Furthermore, enabling FMs to learn new tasks without forgetting prior knowledge is inherently difficult due to their immense parameter count and high model complexity. In contrast, small models can be trained locally under resource-constrained conditions and benefit from more mature CL techniques. To bridge the gap between small models and FMs, we propose the first collaborative framework in FCL, where lightweight local models act as a dynamic bridge, continually adapting to new tasks while enhancing the utility of the large model. The framework includes two novel components: Small Model Continual Fine-tuning, which prevents small models from temporal forgetting, and One-by-One Distillation, which performs personalized fusion of heterogeneous local knowledge on the server. Experimental results demonstrate the framework's superior performance, even when clients use heterogeneous small models.
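To make the collaboration loop concrete, the sketch below gives one plausible reading of the framework in PyTorch: each client continually fine-tunes a small local model with an anti-forgetting term, and the server sequentially distills the clients' small models into the FM's trainable head. All names (SmallClientModel, continual_finetune, one_by_one_distill, toy_loader), the specific loss forms, and the use of a public loader for distillation are illustrative assumptions, not the paper's implementation.

```python
# A minimal sketch of the large-small collaborative FCL loop described above.
# Names and loss choices are assumptions for illustration, not the paper's API.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallClientModel(nn.Module):
    """Lightweight local model; acts as the 'bridge' between private data and the FM."""
    def __init__(self, in_dim=32, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, x):
        return self.net(x)


def continual_finetune(model, task_loader, prev_model=None, kd_weight=1.0, lr=1e-3, epochs=1):
    """Fine-tune the small model on the current local task.

    A distillation penalty against the frozen previous-task snapshot stands in
    for the paper's 'Small Model Continual Fine-tuning' anti-forgetting step (assumed form).
    """
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in task_loader:
            logits = model(x)
            loss = F.cross_entropy(logits, y)
            if prev_model is not None:  # temporal anti-forgetting term
                with torch.no_grad():
                    old_logits = prev_model(x)
                loss = loss + kd_weight * F.mse_loss(logits, old_logits)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


def one_by_one_distill(fm_head, client_models, public_loader, lr=1e-4, epochs=1):
    """Server side: sequentially distill each client's small model into the FM's
    trainable head (the FM backbone itself is assumed frozen). The one-by-one order
    and the public/unlabeled loader are assumptions for illustration.
    """
    opt = torch.optim.Adam(fm_head.parameters(), lr=lr)
    fm_head.train()
    for client_model in client_models:          # personalized fusion, one client at a time
        client_model.eval()
        for _ in range(epochs):
            for x, _ in public_loader:
                with torch.no_grad():
                    teacher_logits = client_model(x)
                student_logits = fm_head(x)
                loss = F.kl_div(F.log_softmax(student_logits, dim=-1),
                                F.softmax(teacher_logits, dim=-1),
                                reduction="batchmean")
                opt.zero_grad()
                loss.backward()
                opt.step()
    return fm_head


def toy_loader(n=64, in_dim=32, num_classes=10):
    """Random data just to make the sketch runnable end to end."""
    x = torch.randn(n, in_dim)
    y = torch.randint(0, num_classes, (n,))
    return torch.utils.data.DataLoader(torch.utils.data.TensorDataset(x, y), batch_size=16)


# One federated run over a short stream of tasks.
clients = [SmallClientModel() for _ in range(3)]
snapshots = [None] * len(clients)
fm_head = SmallClientModel()                    # stand-in for the FM's trainable head

for task in range(2):                           # private, evolving task stream
    for i, model in enumerate(clients):
        continual_finetune(model, toy_loader(), prev_model=snapshots[i])
        snapshots[i] = copy.deepcopy(model).eval()
    one_by_one_distill(fm_head, clients, toy_loader())
```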
Similar Papers
Resource-Constrained Federated Continual Learning: What Does Matter?
Machine Learning (CS)
Helps smart devices learn new things without losing old knowledge.
Federated Continual Recommendation
Machine Learning (CS)
Keeps movie suggestions good as you watch more.