Federated learning over physical channels: adaptive algorithms with near-optimal guarantees
By: Rui Zhang, Wenlong Mou
Potential Business Impact:
Lets computers learn from phones without sending raw data.
In federated learning, communication cost can be significantly reduced by transmitting information over the air through physical channels. In this paper, we propose a new class of adaptive federated stochastic gradient descent (SGD) algorithms that can be implemented over physical channels, accounting for both channel noise and hardware constraints. We establish theoretical guarantees for the proposed algorithms, showing convergence rates that adapt to the stochastic gradient noise level. We further demonstrate the practical effectiveness of our algorithms through simulation studies with deep learning models.
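To make the over-the-air mechanism in the abstract concrete, below is a minimal sketch of one round of over-the-air federated SGD under assumptions introduced here for illustration: each client scales its local gradient to satisfy a transmit power budget, the channel superposes the analog signals and adds i.i.d. Gaussian noise, and the server rescales the received sum before taking an SGD step. The names `client_signal` and `over_the_air_round`, and the parameters `power`, `noise_std`, and `step_size`, are hypothetical; the paper's actual adaptive step-size rule is not reproduced here.

```python
# Sketch of one round of over-the-air federated SGD (illustrative
# assumptions, not the paper's exact algorithm): clients transmit
# analog gradient signals simultaneously; the channel superposes
# them and adds Gaussian noise; the server averages and steps.
import numpy as np

def client_signal(grad: np.ndarray, power: float) -> np.ndarray:
    """Scale a local gradient so its energy meets a transmit power
    budget (a stand-in for the hardware constraint)."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return grad
    scale = min(1.0, np.sqrt(power) / norm)
    return scale * grad

def over_the_air_round(model, client_grads, power=1.0,
                       noise_std=0.1, step_size=0.01, rng=None):
    """One SGD round over a noisy analog superposition channel."""
    rng = rng or np.random.default_rng()
    # Channel: sum of all client signals plus additive Gaussian noise.
    received = sum(client_signal(g, power) for g in client_grads)
    received = received + noise_std * rng.standard_normal(model.shape)
    # Server: rescale by the number of clients, then take an SGD step.
    avg_grad = received / len(client_grads)
    return model - step_size * avg_grad

# Toy usage: 10 clients on the quadratic objective f(w) = ||w||^2 / 2,
# whose gradient is w; stochastic gradients add small Gaussian noise.
rng = np.random.default_rng(0)
w = np.ones(5)
for _ in range(100):
    grads = [w + 0.05 * rng.standard_normal(5) for _ in range(10)]
    w = over_the_air_round(w, grads, rng=rng)
```

In this sketch the superposition property of the channel performs the gradient aggregation for free, which is the source of the communication savings the abstract refers to; the channel noise then enters the SGD update directly, motivating step sizes that adapt to the noise level.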
Similar Papers
Rethinking Federated Learning Over the Air: The Blessing of Scaling Up
Machine Learning (CS)
Lets many computers learn together, privately.
Communication-Efficient Zero-Order and First-Order Federated Learning Methods over Wireless Networks
Machine Learning (CS)
Makes phones learn together without sharing secrets.
Federated Learning on Stochastic Neural Networks
Machine Learning (CS)
Cleans up messy data for smarter AI.