Distribution-Aware Mobility-Assisted Decentralized Federated Learning
By: Md Farhamdur Reza, Reza Jahani, Richeng Jin, and more
Potential Business Impact:
Improves the speed and accuracy of decentralized learning by adding a few moving devices.
Decentralized federated learning (DFL) has attracted significant attention due to its scalability and independence from a central server. In practice, some participating clients can be mobile, yet the impact of user mobility on DFL performance remains largely unexplored, despite its potential to facilitate communication and model convergence. In this work, we demonstrate that introducing a small fraction of mobile clients, even with random movement, can significantly improve the accuracy of DFL by facilitating information flow. To further enhance performance, we propose novel distribution-aware mobility patterns, where mobile clients strategically navigate the network, leveraging knowledge of data distributions and static client locations. The proposed moving strategies mitigate the impact of data heterogeneity and boost learning convergence. Extensive experiments validate the effectiveness of induced mobility in DFL and demonstrate the superiority of our proposed mobility patterns over random movement.
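To make the core idea concrete, below is a minimal sketch, not the authors' code, of a toy DFL simulation: static clients on a unit square run a local step on a simple 1-D quadratic objective (each pulls toward its own heterogeneous data mean), then gossip-average with whoever is within communication radius; a couple of mobile clients carry their models across the field and bridge otherwise poorly connected regions. The "distribution-aware" policy shown is an illustrative heuristic we assume for the sketch (head toward the static client whose local data mean differs most from the carried model), not the paper's exact mobility pattern, and all parameter values are arbitrary.

```python
# Toy sketch: decentralized FL with static clients plus a few mobile relays.
# Assumptions (not from the paper): 1-D quadratic local objectives, uniform
# neighborhood averaging, and a heuristic "aware" mobility rule.
import numpy as np

rng = np.random.default_rng(0)
NUM_STATIC, NUM_MOBILE, RADIUS, ROUNDS, LR, STEP = 40, 2, 0.25, 200, 0.1, 0.15

# Static clients: random positions, heterogeneous data means (two spatial clusters).
pos_s = rng.uniform(0, 1, size=(NUM_STATIC, 2))
mu_s = np.where(pos_s[:, 0] < 0.5, -1.0, 1.0) + 0.1 * rng.standard_normal(NUM_STATIC)
w_s = np.zeros(NUM_STATIC)                      # static clients' models
pos_m = rng.uniform(0, 1, size=(NUM_MOBILE, 2))
w_m = np.zeros(NUM_MOBILE)                      # mobile clients' models (relays, no data)

def move(pos_m, w_m, policy):
    """Advance mobile clients one step under a 'random' or 'aware' policy."""
    if policy == "random":
        step = rng.uniform(-STEP, STEP, size=pos_m.shape)
    else:  # 'aware': head toward the static client with the largest model-data gap
        targets = pos_s[np.argmax(np.abs(mu_s[None, :] - w_m[:, None]), axis=1)]
        direction = targets - pos_m
        step = STEP * direction / (np.linalg.norm(direction, axis=1, keepdims=True) + 1e-9)
    return np.clip(pos_m + step, 0.0, 1.0)

def run(policy):
    global_opt = mu_s.mean()                              # consensus optimum of the toy task
    ws, wm, pm = w_s.copy(), w_m.copy(), pos_m.copy()
    for _ in range(ROUNDS):
        ws -= LR * (ws - mu_s)                            # local step on each client's own data
        pm = move(pm, wm, policy)                         # mobility step
        all_pos = np.vstack([pos_s, pm])
        all_w = np.concatenate([ws, wm])
        dist = np.linalg.norm(all_pos[:, None] - all_pos[None, :], axis=2)
        adj = dist <= RADIUS                              # who can currently talk to whom
        new_w = (adj @ all_w) / adj.sum(axis=1)           # neighborhood (gossip) averaging
        ws, wm = new_w[:NUM_STATIC], new_w[NUM_STATIC:]
    return np.mean((ws - global_opt) ** 2)                # gap to the global optimum

for policy in ("random", "aware"):
    print(f"{policy:>6}: final MSE to consensus optimum = {run(policy):.4f}")
```

In this toy setting, the mobile clients act purely as information carriers: comparing the two printed errors gives a rough sense of how a strategically guided trajectory can mix the two data clusters faster than a random walk, which is the qualitative effect the paper studies at scale.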
Similar Papers
Mobility-Assisted Decentralized Federated Learning: Convergence Analysis and A Data-Driven Approach
Machine Learning (CS)
Makes learning from phones better with moving people.
Mobility-Aware Decentralized Federated Learning with Joint Optimization of Local Iteration and Leader Selection for Vehicular Networks
Networking and Internet Architecture
Cars learn together without sharing private data.
Mobility-Aware Multi-Task Decentralized Federated Learning for Vehicular Networks: Modeling, Analysis, and Optimization
Networking and Internet Architecture
Cars learn together without sharing private driving data.