Sociodynamics-inspired Adaptive Coalition and Client Selection in Federated Learning
By: Alessandro Licciardi, Roberta Raineri, Anton Proskurnikov, and more
Potential Business Impact:
Groups computers to learn better from different data.
Federated Learning (FL) enables privacy-preserving collaborative model training, yet its practical strength is often undermined by client data heterogeneity, which severely degrades model performance. This paper proposes that heterogeneity across clients' data distributions can be effectively addressed by adopting an approach inspired by opinion dynamics over temporal social networks. We introduce Federated Coalition Variance Reduction with Boltzmann Exploration, a variance-reducing selection algorithm in which (1) clients dynamically organize into non-overlapping clusters based on asymptotic agreements, and (2) from each cluster, one client is selected to minimize the expected variance of its model update. Our experiments show that, in heterogeneous scenarios, our algorithm outperforms existing FL algorithms, yielding more accurate results and faster convergence and validating the efficacy of our approach.
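To make the two-step idea concrete, here is a minimal sketch of one selection round, not the authors' reference implementation. It assumes coalitions are formed by greedily grouping clients whose update vectors are similar (a simple stand-in for the paper's opinion-dynamics "asymptotic agreement" criterion) and that each coalition's representative is sampled from a Boltzmann (softmax) distribution favoring clients with lower estimated update variance. The function names, the cosine-similarity threshold, and the temperature are illustrative assumptions.

```python
# Hedged sketch of coalition formation + Boltzmann client selection.
# Not the paper's exact method: clustering rule and scores are assumptions.
import numpy as np

def form_coalitions(updates, threshold=0.8):
    """Greedily group clients whose update vectors have cosine similarity >= threshold."""
    unassigned = set(range(len(updates)))
    coalitions = []
    while unassigned:
        seed = unassigned.pop()
        members = [seed]
        for j in list(unassigned):
            a, b = updates[seed], updates[j]
            cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            if cos >= threshold:
                members.append(j)
                unassigned.remove(j)
        coalitions.append(members)
    return coalitions

def boltzmann_select(coalition, variances, temperature=0.5, rng=None):
    """Sample one client per coalition; lower-variance clients are more likely."""
    rng = rng or np.random.default_rng()
    scores = -np.array([variances[i] for i in coalition]) / temperature
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    return coalition[rng.choice(len(coalition), p=probs)]

# Toy round: 6 clients with 4-dimensional model updates.
rng = np.random.default_rng(0)
updates = [rng.normal(size=4) for _ in range(6)]
variances = rng.uniform(0.1, 1.0, size=6)   # per-client update-variance estimates
coalitions = form_coalitions(updates)
selected = [boltzmann_select(c, variances, rng=rng) for c in coalitions]
print("coalitions:", coalitions, "selected:", selected)
```

The temperature controls the exploration-exploitation trade-off: a high temperature samples coalition members nearly uniformly, while a low one concentrates selection on the lowest-variance client.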
Similar Papers
Client Selection in Federated Learning with Data Heterogeneity and Network Latencies
Machine Learning (CS)
Makes smart computers learn faster from different data.
Hierarchical Federated Learning for Social Network with Mobility
Machine Learning (CS)
Trains AI smarter, using less phone power.
Heterogeneity-Aware Client Sampling: A Unified Solution for Consistent Federated Learning
Machine Learning (CS)
Fixes AI learning when computers are different.