Communication-Efficient Federated Learning with Adaptive Number of Participants

Published: August 19, 2025 | arXiv ID: 2508.13803v1

By: Sergey Skorik, Vladislav Dorofeev, Gleb Molodtsov, and more

Potential Business Impact:

Adaptively chooses how many clients participate in each training round, cutting the communication cost of federated AI training without sacrificing accuracy.

Business Areas:
Crowdsourcing, Collaboration

Rapid scaling of deep learning models has enabled performance gains across domains, yet it has also introduced several challenges. Federated Learning (FL) has emerged as a promising framework to address these concerns by enabling decentralized training. Nevertheless, communication efficiency remains a key bottleneck in FL, particularly under heterogeneous and dynamic client participation. Existing methods, such as FedAvg and FedProx, as well as other approaches including client selection strategies, attempt to mitigate communication costs. However, the problem of choosing the number of clients in a training round remains largely underexplored. We introduce Intelligent Selection of Participants (ISP), an adaptive mechanism that dynamically determines the optimal number of clients per round to enhance communication efficiency without compromising model accuracy. We validate the effectiveness of ISP across diverse setups, including vision transformers, real-world ECG classification, and training with gradient compression. Our results show consistent communication savings of up to 30% with no loss in final model quality. Applying ISP to several real-world ECG classification setups highlights the selection of the number of clients as a distinct task in federated learning.
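The abstract does not spell out ISP's selection rule, so the sketch below is only a minimal, hypothetical illustration of the general idea: a standard FedAvg round paired with a simple loss-based heuristic (`select_num_clients`) that shrinks the client cohort while validation loss is still falling and grows it when progress stalls. All function names, the heuristic, and the toy linear-regression setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(weights, data, lr=0.1, epochs=1):
    """One client's local update on a toy linear-regression objective."""
    X, y = data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def fedavg_round(global_w, clients, num_selected):
    """Standard FedAvg round: sample `num_selected` clients, average their updates."""
    chosen = rng.choice(len(clients), size=num_selected, replace=False)
    updates = [local_sgd(global_w, clients[i]) for i in chosen]
    return np.mean(updates, axis=0)

def select_num_clients(prev_loss, curr_loss, n, n_min, n_max):
    """Hypothetical adaptive rule (NOT the paper's ISP): grow the cohort when
    progress stalls, shrink it while the loss is still falling, trading
    per-round communication against convergence speed."""
    if prev_loss is None:
        return n
    if curr_loss > 0.99 * prev_loss:       # improvement under 1%: stalled
        return min(n * 2, n_max)
    return max(n // 2, n_min)              # still improving: fewer clients

# Toy federated setup: 20 clients whose data share one true linear model.
d, true_w = 5, rng.normal(size=5)
clients = []
for _ in range(20):
    X = rng.normal(size=(50, d))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=50)))

X_val = rng.normal(size=(200, d))
y_val = X_val @ true_w

global_w = np.zeros(d)
n, prev_loss, sent = 4, None, 0
for rnd in range(30):
    global_w = fedavg_round(global_w, clients, n)
    curr_loss = float(np.mean((X_val @ global_w - y_val) ** 2))
    sent += n                              # communication proxy: client uploads
    n = select_num_clients(prev_loss, curr_loss, n, n_min=2, n_max=len(clients))
    prev_loss = curr_loss

print(f"final val loss {curr_loss:.4f}, total client uploads {sent}")
```

The total-uploads counter is a crude stand-in for communication cost: rounds that make good progress with a small cohort cost proportionally less, which is the kind of saving an adaptive participant count targets.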

Page Count
24 pages

Category
Computer Science:
Machine Learning (CS)