Distributionally Robust Federated Learning: An ADMM Algorithm
By: Wen Bai, Yi Wong, Xiao Qiao, and more
Potential Business Impact:
Enables AI models to train reliably across decentralized data sources whose distributions differ.
Federated learning (FL) aims to train machine learning models collaboratively on decentralized data, bypassing the need for centralized data aggregation. Standard FL models often assume that all data come from the same unknown distribution. In practice, however, decentralized data frequently exhibit heterogeneity. We propose a novel FL model, Distributionally Robust Federated Learning (DRFL), that applies distributionally robust optimization to overcome the challenges posed by data heterogeneity and distributional ambiguity. We derive a tractable reformulation for DRFL and develop a solution method based on the alternating direction method of multipliers (ADMM). Our experimental results demonstrate that DRFL outperforms standard FL models under data heterogeneity and ambiguity.
Similar Papers
Federated Learning for Diffusion Models
Machine Learning (CS)
Makes AI learn better from scattered, different data.
FedAPM: Federated Learning via ADMM with Partial Model Personalization
Machine Learning (CS)
Helps AI learn better from different people's phones.
Adaptive Decentralized Federated Learning for Robust Optimization
Machine Learning (CS)
Fixes computer learning when some data is bad.