Local Data Quantity-Aware Weighted Averaging for Federated Learning with Dishonest Clients

Published: April 17, 2025 | arXiv ID: 2504.12577v1

By: Leming Wu, Yaochu Jin, Kuangrong Hao, and more

Potential Business Impact:

Preserves client data privacy while making federated model training robust to misreported data volumes.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Federated learning (FL) enables collaborative training of deep learning models without requiring data to leave local clients, thereby preserving client privacy. The aggregation process on the server plays a critical role in the performance of the resulting FL model. The most commonly used aggregation method is weighted averaging based on the amount of data from each client, which is thought to reflect each client's contribution. However, this method is prone to model bias, as dishonest clients might report inaccurate training data volumes to the server, which are hard to verify. To address this issue, we propose a novel secure Federated Data quantity-aware weighted averaging method (FedDua). It enables FL servers to accurately predict the amount of training data from each client based on the local model gradients they upload. Furthermore, it can be seamlessly integrated into any FL algorithm that involves server-side model aggregation. Extensive experiments on three benchmark datasets demonstrate that FedDua improves global model performance by an average of 3.17% over four popular FL aggregation methods in the presence of inaccurate client data volume declarations.
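The data-quantity-weighted averaging the abstract refers to (the FedAvg-style baseline, not FedDua itself) can be sketched as follows. This is a minimal illustration assuming each client's model is a list of NumPy arrays and each client declares a sample count; it also shows why an inflated count skews the global model toward the dishonest client. Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def weighted_average(client_params, declared_counts):
    """Aggregate client model parameters weighted by declared sample counts.

    client_params: list (one entry per client) of lists of np.ndarray layers
    declared_counts: list of ints, each client's self-reported data volume
    """
    total = sum(declared_counts)
    weights = [n / total for n in declared_counts]
    # For each layer index, take the weight-scaled sum across clients.
    return [
        sum(w * layer for w, layer in zip(weights, layers))
        for layers in zip(*client_params)
    ]

# Two clients with equal data; client B dishonestly inflates its count.
params_a = [np.full((2,), 1.0)]
params_b = [np.full((2,), 3.0)]
honest = weighted_average([params_a, params_b], [100, 100])   # -> 2.0 per entry
skewed = weighted_average([params_a, params_b], [100, 300])   # -> 2.5 per entry
```

Because the server cannot verify `declared_counts`, a single inflated report shifts the average toward that client; FedDua instead estimates each client's data quantity from its uploaded gradients before weighting.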

Country of Origin
πŸ‡ΈπŸ‡¬ πŸ‡¨πŸ‡³ Singapore, China

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)