Towards a Larger Model via One-Shot Federated Learning on Heterogeneous Client Models
By: Wenxuan Ye, Xueli An, Onur Ayan, and more
Potential Business Impact:
Trains bigger AI models without sharing private data.
Large models are renowned for superior performance, outperforming smaller ones even without reaching billion-parameter scales. While mobile network servers have ample computational resources to support larger models than client devices, privacy constraints prevent clients from directly sharing their raw data. Federated Learning (FL) enables decentralized clients to collaboratively train a shared model by exchanging model parameters instead of transmitting raw data. However, it requires a uniform model architecture and multiple communication rounds, requirements that neglect resource heterogeneity, impose heavy computational demands on clients, and increase communication overhead. To address these challenges, we propose FedOL, which constructs a larger and more comprehensive server model in a one-shot setting (i.e., within a single communication round). Instead of sharing model parameters, FedOL employs knowledge distillation, where clients exchange only their models' prediction outputs on an unlabeled public dataset. This reduces communication overhead, since compact predictions are transmitted instead of full model weights, and enables model customization by allowing heterogeneous model architectures. A key challenge in this setting is that client predictions may be biased by skewed local data distributions, and the lack of ground-truth labels in the public dataset further complicates reliable learning. To mitigate these issues, FedOL introduces a specialized objective function that iteratively refines pseudo-labels and the server model, improving learning reliability. Complementing this, FedOL incorporates a tailored pseudo-label generation and knowledge distillation strategy that effectively integrates diverse knowledge. Simulation results show that FedOL significantly outperforms existing baselines, offering a cost-effective solution for mobile networks where clients possess valuable private data but limited computational resources.
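To make the exchange concrete, below is a minimal PyTorch sketch of the one-shot distillation loop the abstract describes. The function names, the plain averaging of client predictions into pseudo-labels, and the 0.5/0.5 blending used for refinement are illustrative assumptions; FedOL's actual pseudo-label generation and objective function are defined in the paper.

```python
import torch
import torch.nn.functional as F

def aggregate_pseudo_labels(client_logits, temperature=1.0):
    """Average client predictions on the public set into soft pseudo-labels.

    client_logits: list of [N, C] tensors, one per client, produced by
    (possibly heterogeneous) client models on the unlabeled public set.
    Uniform averaging is an assumption; FedOL uses a tailored strategy.
    """
    probs = [F.softmax(z / temperature, dim=1) for z in client_logits]
    return torch.stack(probs).mean(dim=0)  # [N, C] soft labels

def distill_server_model(server_model, public_loader, client_logits,
                         refine_rounds=3, epochs=5, lr=1e-3):
    """Train the (larger) server model against pseudo-labels, then
    iteratively refine the labels with the server's own predictions.

    public_loader is assumed to yield (idx, x) pairs, where idx indexes
    examples in the public set, so pseudo-labels can be looked up and
    updated consistently regardless of batch order.
    """
    pseudo = aggregate_pseudo_labels(client_logits)
    opt = torch.optim.Adam(server_model.parameters(), lr=lr)
    for _ in range(refine_rounds):
        server_model.train()
        for _ in range(epochs):
            for idx, x in public_loader:  # unlabeled public data only
                opt.zero_grad()
                log_probs = F.log_softmax(server_model(x), dim=1)
                loss = F.kl_div(log_probs, pseudo[idx],
                                reduction="batchmean")
                loss.backward()
                opt.step()
        # Refinement: blend the server's current beliefs back into the
        # pseudo-labels (the 0.5/0.5 weighting is a placeholder).
        server_model.eval()
        with torch.no_grad():
            for idx, x in public_loader:
                p = F.softmax(server_model(x), dim=1)
                pseudo[idx] = 0.5 * pseudo[idx] + 0.5 * p
    return server_model
```

Note that the only per-client upload in this sketch is a single [N, C] prediction matrix over the public set, which is what keeps the protocol to one communication round and lets each client keep its own architecture.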
Similar Papers
Not All Clients Are Equal: Personalized Federated Learning on Heterogeneous Multi-Modal Clients
Machine Learning (CS)
AI learns from everyone without sharing private data.
FedPromo: Federated Lightweight Proxy Models at the Edge Bring New Domains to Foundation Models
CV and Pattern Recognition
Makes big AI models work on small devices.
Beyond Aggregation: Guiding Clients in Heterogeneous Federated Learning
Machine Learning (CS)
Guides each client's training instead of just averaging models.