Federated Learning via Meta-Variational Dropout
By: Insu Jeon, Minui Hong, Junhyeog Yun, and more
Potential Business Impact:
Helps AI learn better from private data.
Federated Learning (FL) aims to train a global inference model from remotely distributed clients and has gained popularity for its data-privacy benefits. However, traditional FL often struggles in practice with model overfitting and divergent local models caused by limited and non-IID data among clients. To address these issues, we introduce a novel Bayesian meta-learning approach called meta-variational dropout (MetaVD). MetaVD learns to predict client-dependent dropout rates via a shared hypernetwork, enabling effective model personalization of FL algorithms in limited non-IID data settings. We also emphasize the posterior adaptation view of meta-learning and the posterior aggregation view of Bayesian FL via the conditional dropout posterior. In extensive experiments on various sparse and non-IID FL datasets, MetaVD demonstrated excellent classification accuracy and uncertainty calibration, especially for out-of-distribution (OOD) clients. MetaVD also compresses the local model parameters needed for each client, mitigating overfitting and reducing communication costs. Code is available at https://github.com/insujeon/MetaVD.
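To make the core mechanism concrete, the Python (PyTorch) sketch below illustrates one way a shared hypernetwork could map a client embedding to per-unit variational dropout rates (log-alpha) that personalize a global layer with multiplicative Gaussian noise. This is a minimal sketch under assumed names and shapes (HyperDropout, PersonalizedLinear, client_emb_dim, num_units), not the authors' released MetaVD implementation.

# Minimal sketch (illustrative, not the authors' code): a shared hypernetwork
# predicts client-dependent variational dropout rates for a global layer.
import torch
import torch.nn as nn

class HyperDropout(nn.Module):
    """Shared hypernetwork: client embedding -> per-unit log-alpha (noise variance)."""
    def __init__(self, client_emb_dim, num_units, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(client_emb_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, num_units),
        )

    def forward(self, client_emb):
        # log-alpha parameterizes multiplicative Gaussian noise; the
        # Bernoulli-equivalent dropout rate is alpha / (1 + alpha) = sigmoid(log-alpha).
        return self.net(client_emb)

class PersonalizedLinear(nn.Module):
    """Global linear layer whose activations receive client-specific variational dropout."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)

    def forward(self, x, log_alpha):
        h = self.fc(x)
        if self.training:
            # Reparameterized multiplicative Gaussian noise: h * (1 + sqrt(alpha) * eps)
            alpha = log_alpha.exp()
            eps = torch.randn_like(h)
            h = h * (1.0 + alpha.sqrt() * eps)
        return h

# Toy usage: one client's embedding personalizes the shared layer.
hyper = HyperDropout(client_emb_dim=8, num_units=16)
layer = PersonalizedLinear(in_dim=32, out_dim=16)
client_emb = torch.randn(1, 8)           # per-client embedding (hypothetical)
log_alpha = hyper(client_emb)            # client-dependent dropout parameters
x = torch.randn(4, 32)                   # a local mini-batch
out = layer(x, log_alpha)                # personalized forward pass
dropout_rate = torch.sigmoid(log_alpha)  # Bernoulli-equivalent rate per unit

In this view, only the compact client embedding and the predicted dropout rates vary per client, while the hypernetwork and base layer remain shared, which is one plausible reading of how MetaVD personalizes models while limiting per-client parameters and communication.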
Similar Papers
A Generalized Meta Federated Learning Framework with Theoretical Convergence Guarantees
Machine Learning (CS)
Helps AI learn better from many separate computers.
Boosting Generalization Performance in Model-Heterogeneous Federated Learning Using Variational Transposed Convolution
Machine Learning (CS)
Helps computers learn without sharing private data.
Neural Variational Dropout Processes
Machine Learning (CS)
Teaches computers to learn new tasks faster.