Cellular Traffic Prediction via Byzantine-robust Asynchronous Federated Learning
By: Hui Ma, Kai Yang, Yang Jiao
Potential Business Impact:
Protects user data privacy while predicting network traffic.
Network traffic prediction plays a crucial role in intelligent network operation. Traditional prediction methods often rely on centralized training, which requires transferring vast amounts of traffic data to a central server; this approach raises both latency and privacy concerns. To address these issues, federated learning combined with differential privacy has emerged as a way to improve data privacy in distributed settings. Nonetheless, existing federated learning protocols are vulnerable to Byzantine attacks, which can significantly compromise model robustness. Developing a robust and privacy-preserving prediction model in the presence of Byzantine clients remains a significant challenge. To this end, we propose an asynchronous differentially private federated learning framework based on distributionally robust optimization. The proposed framework uses multiple clients to train the prediction model collaboratively under local differential privacy. In addition, regularization techniques are employed to further improve the Byzantine robustness of the models. We have conducted extensive experiments on three real-world datasets, and the results show that our proposed distributed algorithm achieves superior performance over existing methods.
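The two ingredients the abstract names, local differential privacy on client updates and a Byzantine-robust aggregation step, can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the clipping threshold, noise scale, and the coordinate-wise median rule are illustrative assumptions standing in for the paper's distributionally robust optimization approach.

```python
import numpy as np

def local_dp_update(grad, clip_norm=1.0, noise_std=0.1, rng=None):
    """Client side: clip the update's L2 norm, then add Gaussian noise
    (a common local differential privacy mechanism; parameters are
    illustrative, not the paper's)."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad)
    if norm > clip_norm:
        grad = grad * (clip_norm / norm)
    return grad + rng.normal(0.0, noise_std, size=grad.shape)

def robust_aggregate(updates):
    """Server side: coordinate-wise median, a standard Byzantine-robust
    aggregation rule (used here as a stand-in for the paper's method)."""
    return np.median(np.stack(updates), axis=0)

# Toy round: 4 honest clients plus 1 Byzantine client sending a huge update.
rng = np.random.default_rng(0)
true_grad = np.array([1.0, -2.0, 0.5])
honest = [local_dp_update(true_grad + rng.normal(0.0, 0.05, 3), rng=rng)
          for _ in range(4)]
byzantine = np.array([100.0, 100.0, 100.0])
agg = robust_aggregate(honest + [byzantine])
# The median ignores the single outlier, so `agg` stays near the
# (clipped, noised) honest gradient direction.
```

Because the median of five values per coordinate falls on an honest client's contribution, a single malicious update cannot drag the aggregate arbitrarily far, which is the intuition behind Byzantine-robust aggregation; the noise added before transmission is what protects each client's raw traffic data.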
Similar Papers
Asynchronous Secure Federated Learning with Byzantine aggregators
Distributed, Parallel, and Cluster Computing
Keeps your private data safe while learning.
Byzantine-Robust Federated Learning Using Generative Adversarial Networks
Cryptography and Security
Keeps AI learning safe from bad data.
Online Decentralized Federated Multi-task Learning With Trustworthiness in Cyber-Physical Systems
Machine Learning (CS)
Makes self-driving cars work even with bad actors.