Adaptive Latent-Space Constraints in Personalized FL
By: Sana Ayromlou, D. B. Emerson
Potential Business Impact:
Enables organizations to jointly train personalized models on their differing data without sharing that data, improving accuracy where datasets vary widely.
Federated learning (FL) has become an effective and widely used approach to training deep learning models on decentralized datasets held by distinct clients. FL also strengthens both security and privacy protections for training data. Common challenges associated with statistical heterogeneity between distributed datasets have spurred significant interest in personalized FL (pFL) methods, where models combine aspects of global learning with local modeling specific to each client's unique characteristics. In this work, the efficacy of theoretically supported, adaptive maximum mean discrepancy (MMD) measures within the Ditto framework, a state-of-the-art technique in pFL, is investigated. The use of such measures significantly improves model performance across a variety of tasks, especially those with pronounced feature heterogeneity. While the Ditto algorithm is specifically considered, such measures are directly applicable to a number of other pFL settings, and the results motivate the use of constraints tailored to the various kinds of heterogeneity expected in FL systems.
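As a rough illustration of what a latent-space MMD constraint could look like, the sketch below computes an RBF-kernel maximum mean discrepancy between the latent features of a client's personal model and those of the global model, and adds it to the local task loss in the spirit of Ditto's regularized local objective. The `features`/`head` model interface, the bandwidth `sigma`, and the fixed penalty weight `lam` are illustrative assumptions rather than the paper's implementation; in particular, the adaptive weighting of the constraint studied in the paper is not shown here.

```python
import torch

def rbf_mmd2(x, y, sigma=1.0):
    """Biased estimate of squared MMD between feature batches x and y with an RBF kernel."""
    def kernel(a, b):
        sq_dists = torch.cdist(a, b) ** 2              # pairwise squared Euclidean distances
        return torch.exp(-sq_dists / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

def local_loss(local_model, global_model, inputs, targets, criterion, lam=0.1):
    """Ditto-style local objective with a latent-space MMD penalty (sketch).

    Assumes both models expose a `features` method returning latent representations
    and a `head` method mapping those representations to predictions.
    """
    z_local = local_model.features(inputs)
    with torch.no_grad():                              # global model is held fixed during the local step
        z_global = global_model.features(inputs)
    task_loss = criterion(local_model.head(z_local), targets)
    return task_loss + lam * rbf_mmd2(z_local, z_global)
```

Under these assumptions, the MMD term plays the role of Ditto's proximal penalty but acts on latent representations rather than raw weights, which is what makes it sensitive to feature heterogeneity across clients.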
Similar Papers
Not All Clients Are Equal: Personalized Federated Learning on Heterogeneous Multi-Modal Clients
Machine Learning (CS)
Personalizes federated learning for clients whose data differ in type and amount, without sharing private data.
Convergence-Privacy-Fairness Trade-Off in Personalized Federated Learning
Machine Learning (CS)
Examines how personalized federated learning trades off convergence, privacy, and fairness.
Asynchronous Personalized Federated Learning through Global Memorization
Machine Learning (CS)
Lets devices contribute updates to personalized models on their own schedule, without exposing raw data.