An Adaptive Clustering Scheme for Client Selections in Communication-Efficient Federated Learning
By: Yan-Ann Chen, Guan-Lin Chen
Potential Business Impact:
Smartly groups users to train computers faster.
Federated learning is a novel decentralized learning architecture. During training, clients and the server must continuously upload and download model parameters, which consumes substantial network transmission resources. Some methods use clustering to identify more representative clients and select only a subset of them for training, while still preserving training accuracy. However, in federated learning it is not trivial to know which number of clusters yields the best training result. We therefore propose to dynamically adjust the number of clusters to find the most suitable grouping. This can reduce the number of clients participating in training, cutting communication costs without degrading model performance. We evaluate the scheme on a non-IID handwritten digit recognition dataset and reduce communication and transmission cost by almost 50% compared with traditional federated learning, without affecting model accuracy.
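The core idea, adaptively choosing a cluster count and then sampling one representative client per cluster, can be sketched as below. Everything here is illustrative, not the paper's exact algorithm: the synthetic 2-D "client update" vectors, the plain k-means with farthest-point seeding, and the silhouette score used to pick the number of clusters are all stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-client feature vectors: three latent non-IID groups of 10 clients.
# (Hypothetical stand-in for whatever per-client statistics are clustered on.)
clients = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(10, 2))
    for c in ([0.0, 0.0], [4.0, 0.0], [0.0, 4.0])
])

def farthest_point_init(X, k):
    """Deterministic seeding: start at X[0], then repeatedly add the point
    farthest from all centroids chosen so far."""
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.linalg.norm(X[:, None] - np.array(centroids)[None], axis=2).min(axis=1)
        centroids.append(X[d.argmax()])
    return np.array(centroids)

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means; returns (labels, centroids)."""
    centroids = farthest_point_init(X, k)
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centroids[None], axis=2).argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def silhouette(X, labels):
    """Mean silhouette coefficient: higher means tighter, better-separated clusters."""
    D = np.linalg.norm(X[:, None] - X[None], axis=2)
    n, scores = len(X), []
    for i in range(n):
        same = labels == labels[i]
        if same.sum() == 1:          # singleton cluster: silhouette defined as 0
            scores.append(0.0)
            continue
        a = D[i, same & (np.arange(n) != i)].mean()
        b = min(D[i, labels == c].mean() for c in np.unique(labels) if c != labels[i])
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

# Adaptive step: try several cluster counts and keep the best-scoring one.
scores = {k: silhouette(clients, kmeans(clients, k)[0]) for k in range(2, 7)}
best_k = max(scores, key=scores.get)

# Client selection: one representative per cluster, the client nearest its centroid.
labels, centroids = kmeans(clients, best_k)
selected = [
    int(np.flatnonzero(labels == j)[
        np.linalg.norm(clients[labels == j] - centroids[j], axis=1).argmin()])
    for j in range(best_k)
]
# Only the clients in `selected` would upload updates this round,
# shrinking per-round communication from 30 clients to best_k.
```

With the three well-separated synthetic groups above, the silhouette criterion settles on three clusters, so only three of the thirty clients would communicate in a given round; the actual paper adjusts the cluster count dynamically during training rather than via a single offline sweep.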
Similar Papers
Client Selection in Federated Learning with Data Heterogeneity and Network Latencies
Machine Learning (CS)
Makes smart computers learn faster from different data.
Communication-Efficient Federated Learning with Adaptive Number of Participants
Machine Learning (CS)
Chooses best clients to train AI faster.
Sociodynamics-inspired Adaptive Coalition and Client Selection in Federated Learning
Machine Learning (CS)
Groups computers to learn better from different data.