Data Heterogeneity-Aware Client Selection for Federated Learning in Wireless Networks
By: Yanbing Yang, Huiling Zhu, Wenchi Cheng, and more
Potential Business Impact:
Helps phones train AI without sharing private data.
Federated Learning (FL) enables mobile edge devices, acting as clients, to collaboratively train a shared model in a decentralized manner while keeping their local data private. However, the efficiency of FL in wireless networks is limited not only by constraints on communication and computational resources but also by significant data heterogeneity among clients, particularly in large-scale networks. This paper first presents a theoretical analysis of how client data heterogeneity affects the global model's generalization error; unmanaged heterogeneity can force repeated training cycles, increasing energy consumption and prolonging latency. Based on these theoretical insights, an optimization problem is formulated to jointly minimize learning latency and energy consumption while constraining generalization error. A joint client selection and resource allocation (CSRA) approach is then proposed, employing a series of convex optimization and relaxation techniques. Extensive simulation results demonstrate that the proposed CSRA scheme yields higher test accuracy, reduced learning latency, and lower energy consumption compared to baseline methods that do not account for data heterogeneity.
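The paper's exact CSRA formulation is not reproduced here, but the core idea of heterogeneity-aware client selection can be sketched as follows. This is an illustrative toy, not the authors' algorithm: each client is scored by the KL divergence of its label distribution from the global one (a common heterogeneity proxy) plus weighted latency and energy costs, and the lowest-scoring clients are selected. All names (`select_clients`, the weights `alpha`/`beta`) are hypothetical.

```python
import math

def label_distribution(counts):
    """Normalize per-class sample counts into a probability distribution."""
    total = sum(counts)
    return [c / total for c in counts]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q): a simple proxy for a client's data heterogeneity."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def select_clients(clients, global_dist, k, alpha=1.0, beta=1.0):
    """Score each client by heterogeneity plus weighted latency/energy cost,
    then pick the k lowest-scoring clients (hypothetical scoring rule)."""
    scored = []
    for name, counts, latency, energy in clients:
        div = kl_divergence(label_distribution(counts), global_dist)
        scored.append((div + alpha * latency + beta * energy, name))
    scored.sort()
    return [name for _, name in scored[:k]]

# Toy example: 3 clients, 2 classes, uniform global label distribution.
clients = [
    ("A", [50, 50], 0.2, 0.1),   # balanced data, cheap
    ("B", [95, 5],  0.1, 0.1),   # highly skewed data
    ("C", [60, 40], 0.5, 0.4),   # mildly skewed, expensive
]
print(select_clients(clients, [0.5, 0.5], k=2))  # -> ['A', 'B']
```

In this toy run, client A wins on both counts, while B's skew penalty is outweighed by C's high latency/energy cost, so A and B are chosen. The paper instead solves a joint convex-relaxed problem that couples selection with resource allocation under a generalization-error constraint.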
Similar Papers
Client Selection in Federated Learning with Data Heterogeneity and Network Latencies
Machine Learning (CS)
Makes smart computers learn faster from different data.
Heterogeneity-Aware Client Sampling: A Unified Solution for Consistent Federated Learning
Machine Learning (CS)
Fixes AI learning when computers are different.
Robust Federated Learning in Unreliable Wireless Networks: A Client Selection Approach
Distributed, Parallel, and Cluster Computing
Helps computers learn better with bad connections.