Local Data Quantity-Aware Weighted Averaging for Federated Learning with Dishonest Clients
By: Leming Wu, Yaochu Jin, Kuangrong Hao, and more
Potential Business Impact:
Protects private data while improving AI learning.
Federated learning (FL) enables collaborative training of deep learning models without requiring data to leave local clients, thereby preserving client privacy. The aggregation process on the server plays a critical role in the performance of the resulting FL model. The most commonly used aggregation method is weighted averaging based on the amount of data from each client, which is thought to reflect each client's contribution. However, this method is prone to model bias, as dishonest clients may report inaccurate training data volumes to the server, and such reports are hard to verify. To address this issue, we propose a novel secure Federated Data quantity-aware weighted averaging method (FedDua). It enables FL servers to accurately predict the amount of training data from each client based on the local model gradients they upload. Furthermore, it can be seamlessly integrated into any FL algorithm that involves server-side model aggregation. Extensive experiments on three benchmark datasets demonstrate that FedDua improves global model performance by an average of 3.17% compared to four popular FL aggregation methods in the presence of inaccurate client data volume declarations.
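For context, the quantity-weighted averaging the abstract refers to is the standard FedAvg-style aggregation, where each client's update is weighted by its self-reported sample count. A minimal sketch is shown below; the function names and the NumPy dict representation of model parameters are illustrative assumptions, not from the paper, and this shows the baseline scheme being attacked, not the FedDua defense itself:

```python
import numpy as np

def quantity_weighted_average(client_models, reported_counts):
    """FedAvg-style aggregation weighted by reported data volume.

    client_models: list of dicts mapping parameter name -> np.ndarray
    reported_counts: list of ints, each client's self-reported sample count.

    Note: the counts are self-reported, so a dishonest client can inflate
    its count to pull the global model toward its own update. This is the
    vulnerability the paper addresses by having the server estimate the
    true data volume from the uploaded gradients instead.
    """
    total = sum(reported_counts)
    weights = [n / total for n in reported_counts]
    return {
        name: sum(w * m[name] for w, m in zip(weights, client_models))
        for name in client_models[0]
    }

# Hypothetical usage: client 2 honestly trained on 100 samples but
# reports 10000, so its parameters dominate the weighted average.
models = [{"w": np.array([0.1, 0.2])},
          {"w": np.array([0.3, 0.1])},
          {"w": np.array([5.0, 5.0])}]
print(quantity_weighted_average(models, [1000, 1000, 10000]))
```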
Similar Papers
FedAWA: Adaptive Optimization of Aggregation Weights in Federated Learning Using Client Vectors
Machine Learning (CS)
Makes smart computers learn better together privately.
FedDuA: Doubly Adaptive Federated Learning
Machine Learning (CS)
Teaches computers faster without sharing private info.
Federated Learning in the Wild: A Comparative Study for Cybersecurity under Non-IID and Unbalanced Settings
Cryptography and Security
Helps computers find online attacks without sharing private data.