FedBEns: One-Shot Federated Learning based on Bayesian Ensemble
By: Jacopo Talpini, Marco Savi, Giovanni Neglia
Potential Business Impact:
Lets many devices or organizations jointly train one model in a single round of communication, without pooling their raw data.
One-Shot Federated Learning (FL) is a recent paradigm that enables multiple clients to cooperatively learn a global model in a single round of communication with a central server. In this paper, we analyze the One-Shot FL problem through the lens of Bayesian inference and propose FedBEns, an algorithm that leverages the inherent multimodality of local loss functions to find better global models. Our algorithm uses a mixture of Laplace approximations for each client's local posterior, which the server then aggregates to infer the global model. We conduct extensive experiments on various datasets, demonstrating that the proposed method outperforms competing baselines that typically rely on unimodal approximations of the local losses.
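To make the aggregation step concrete, the sketch below illustrates the general idea, assuming each client summarizes its local posterior as a mixture of Laplace (Gaussian) components with diagonal precisions: the product of the client mixtures is again a mixture, with one component per combination of local components, and each combined component is a product of Gaussians (precisions add, the mean is the precision-weighted average). The helper names, the omission of mixture weights, and the final model-selection rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from itertools import product

def combine_gaussians(means, precisions):
    """Product of Gaussians with diagonal precisions: precisions add,
    and the mean is the precision-weighted average of the client means."""
    prec = np.sum(precisions, axis=0)
    mean = np.sum([p * m for p, m in zip(precisions, means)], axis=0) / prec
    return mean, prec

def aggregate_mixtures(client_mixtures):
    """Each client contributes a mixture of Laplace components, modelled here
    as (mean, diagonal_precision) pairs. Multiplying the client mixtures
    yields a mixture with one component per combination of local components."""
    global_components = []
    for combo in product(*client_mixtures):
        means = [m for m, _ in combo]
        precs = [p for _, p in combo]
        global_components.append(combine_gaussians(means, precs))
    return global_components

# Toy usage: two clients, each with two Laplace modes over a 3-parameter model.
rng = np.random.default_rng(0)
client_mixtures = [
    [(rng.normal(size=3), np.abs(rng.normal(size=3)) + 1.0) for _ in range(2)]
    for _ in range(2)
]
components = aggregate_mixtures(client_mixtures)

# A crude global-model choice for illustration: take the mean of the
# highest-precision combined component (the paper's selection rule may differ).
best_mean, _ = max(components, key=lambda c: np.sum(c[1]))
print(best_mean)
```

Note that the number of combined components grows as the product of the clients' mixture sizes, so a practical implementation would prune or reweight components; this sketch keeps all of them for clarity.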
Similar Papers
One-Shot Clustering for Federated Learning
Machine Learning (CS)
Finds best time to group devices for learning.
One-Shot Clustering for Federated Learning Under Clustering-Agnostic Assumption
Machine Learning (CS)
Finds best groups for personalized AI.
Client Selection in Federated Learning with Data Heterogeneity and Network Latencies
Machine Learning (CS)
Makes smart computers learn faster from different data.