Noise Resilient Over-The-Air Federated Learning In Heterogeneous Wireless Networks
By: Zubair Shaban, Nazreen Shah, Ranjitha Prasad
Potential Business Impact:
Helps AI learn reliably from many phones over noisy wireless connections.
In 6G wireless networks, Artificial Intelligence (AI)-driven applications demand the adoption of Federated Learning (FL) to enable efficient and privacy-preserving model training across distributed devices. Over-The-Air Federated Learning (OTA-FL) exploits the superposition property of multiple access channels, allowing edge users in 6G networks to share spectral resources efficiently and perform low-latency global model aggregation. However, these advantages come with challenges: traditional OTA-FL techniques suffer from the joint effects of Additive White Gaussian Noise (AWGN) at the server, channel fading, and both data and system heterogeneity at the participating edge devices. In this work, we propose the novel Noise Resilient Over-the-Air Federated Learning (NoROTA-FL) framework to jointly tackle these challenges in federated wireless networks. In NoROTA-FL, the local optimization problems are solved to controlled levels of inexactness, which manifests as an additional proximal constraint at the clients. This approach provides robustness against straggler-induced partial work, heterogeneity, noise, and fading. From a theoretical perspective, we leverage zeroth- and first-order inexactness to establish convergence guarantees for non-convex optimization problems in the presence of heterogeneous data and varying system capabilities. Experimentally, we validate NoROTA-FL on real-world datasets, including FEMNIST, CIFAR-10, and CIFAR-100, demonstrating its robustness in noisy and heterogeneous environments. Compared to state-of-the-art baselines such as COTAF and FedProx, NoROTA-FL achieves significantly more stable convergence and higher accuracy, particularly in the presence of stragglers.
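The abstract describes two ingredients: a proximal constraint on each client's local problem, which is only solved inexactly, and over-the-air aggregation in which client transmissions superpose over a fading multiple-access channel and are received with AWGN at the server. The sketch below illustrates that general recipe on a toy linear-regression setup; the hyperparameters (mu, learning rate, number of local steps, noise level) and the simple channel-inversion precoding are assumptions for illustration, not the paper's exact NoROTA-FL algorithm.

```python
import numpy as np

# Minimal sketch of proximal (FedProx-style) local training combined with noisy
# over-the-air (OTA) aggregation under fading, loosely following the recipe the
# abstract describes. The hyperparameters (mu, lr, local_steps, noise_std) and
# the channel-inversion precoding are illustrative assumptions, not the exact
# NoROTA-FL design.

rng = np.random.default_rng(0)
d, num_clients = 20, 5
mu = 0.1                    # proximal coefficient (assumed value)
lr, local_steps = 0.05, 10  # inexact local solve: only a few gradient steps
noise_std = 0.1             # AWGN standard deviation at the server

# Synthetic heterogeneous clients: each holds its own linear-regression data.
clients = [(rng.normal(size=(50, d)), rng.normal(size=50)) for _ in range(num_clients)]

def local_update(w_global, X, y):
    """Approximately minimize f_i(w) + (mu/2) * ||w - w_global||^2."""
    w = w_global.copy()
    for _ in range(local_steps):
        grad = X.T @ (X @ w - y) / len(y) + mu * (w - w_global)
        w -= lr * grad
    return w  # an inexact solution of the proximal local problem

w_global = np.zeros(d)
for _ in range(20):
    updates = np.stack([local_update(w_global, X, y) for X, y in clients])
    # OTA aggregation: clients precode by inverting their fading coefficients,
    # the channel applies fading and superposes the transmissions, and the
    # server observes the sum corrupted by AWGN.
    h = np.abs(rng.normal(1.0, 0.2, size=num_clients))  # fading magnitudes
    precoded = updates / h[:, None]                      # channel inversion
    received = (precoded * h[:, None]).sum(axis=0)       # superposition
    received += rng.normal(0.0, noise_std, size=d)       # server-side AWGN
    w_global = received / num_clients                    # noisy global model

avg_loss = np.mean([np.mean((X @ w_global - y) ** 2) for X, y in clients])
print(f"average client loss after training: {avg_loss:.3f}")
```

Even with noise injected into every aggregation round, the proximal term keeps local updates anchored to the global model, which is the intuition the abstract gives for robustness to stragglers, heterogeneity, noise, and fading.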
Similar Papers
Biased Federated Learning under Wireless Heterogeneity
Machine Learning (CS)
Trains AI faster on phones without sharing data.
Non-Convex Over-the-Air Heterogeneous Federated Learning: A Bias-Variance Trade-off
Machine Learning (CS)
Lets phones train AI together faster wirelessly.