Decentralized Federated Learning of Probabilistic Generative Classifiers
By: Aritz Pérez, Carlos Echegoyen, Guzmán Santafé
Potential Business Impact:
Lets computers learn together without sharing secrets.
Federated learning is a paradigm of increasing relevance in real-world applications, aimed at building a global model across a network of heterogeneous users without sharing private data. We focus on model learning over decentralized architectures, where users collaborate directly to update the global model without relying on a central server. In this context, the paper proposes a novel approach to collaboratively learning probabilistic generative classifiers with a parametric form. The framework consists of a communication network over a set of local nodes, each of which holds its own local data, together with a local updating rule. The proposal involves sharing local statistics with neighboring nodes; each node aggregates its neighbors' information and iteratively learns its own local classifier, which progressively converges to a global model. Extensive experiments demonstrate that the algorithm consistently converges to a globally competitive model across a wide range of network topologies, network sizes, local dataset sizes, and extreme non-i.i.d. data distributions.
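The abstract describes the update rule only at a high level: each node shares local statistics with its neighbors, aggregates them, and refits its local classifier. Below is a minimal sketch of that loop for one concrete model family, assuming a Gaussian naive Bayes classifier, a fixed undirected communication graph, and uniform averaging over self plus neighbors. The `Node` class, `gossip_round` function, and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class Node:
    """One local node: holds private data only as sufficient statistics."""
    def __init__(self, X, y, n_classes):
        d = X.shape[1]
        # Sufficient statistics of a Gaussian naive Bayes model:
        # per-class counts, feature sums, and squared feature sums.
        self.count = np.zeros(n_classes)
        self.s1 = np.zeros((n_classes, d))
        self.s2 = np.zeros((n_classes, d))
        for c in range(n_classes):
            Xc = X[y == c]
            self.count[c] = len(Xc)
            self.s1[c] = Xc.sum(axis=0)
            self.s2[c] = (Xc ** 2).sum(axis=0)

    def stats(self):
        return self.count.copy(), self.s1.copy(), self.s2.copy()

    def aggregate(self, neighbor_stats):
        # Average own statistics with the neighbors' (uniform weights);
        # other mixing schemes (e.g. Metropolis weights) would also work.
        all_stats = [self.stats()] + list(neighbor_stats)
        self.count = np.mean([s[0] for s in all_stats], axis=0)
        self.s1 = np.mean([s[1] for s in all_stats], axis=0)
        self.s2 = np.mean([s[2] for s in all_stats], axis=0)

    def classifier(self):
        # Maximum-likelihood parameters from the aggregated statistics;
        # assumes every class count is positive after enough mixing rounds.
        prior = self.count / self.count.sum()
        mean = self.s1 / self.count[:, None]
        var = self.s2 / self.count[:, None] - mean ** 2
        return prior, mean, var


def gossip_round(nodes, adjacency):
    """One synchronous round: every node pulls its neighbors' statistics."""
    snapshot = [n.stats() for n in nodes]  # statistics before this round
    for i, node in enumerate(nodes):
        node.aggregate([snapshot[j] for j in adjacency[i]])
```

A small usage example with an extreme non-i.i.d. split, where each node sees a single class (node indices, the ring topology, and the data generator are likewise hypothetical):

```python
# Four nodes on a ring; nodes 0 and 2 hold only class 0, nodes 1 and 3 only class 1.
rng = np.random.default_rng(0)
nodes = [Node(rng.normal(c, 1.0, size=(50, 3)), np.full(50, c % 2), 2)
         for c in range(4)]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
for _ in range(20):
    gossip_round(nodes, ring)
prior, mean, var = nodes[0].classifier()
```

Because maximum-likelihood estimates for this model family depend only on ratios of sufficient statistics, repeated averaging over a connected graph drives every node's statistics toward the network-wide average, so each local classifier approaches the one that would be learned from the pooled data, without any raw data ever leaving a node.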
Similar Papers
Federated Learning on Stochastic Neural Networks
Machine Learning (CS)
Cleans up messy data for smarter AI.
Federated Learning with Discriminative Naive Bayes Classifier
Machine Learning (CS)
Keeps your private data safe while learning.
Asynchronous Personalized Federated Learning through Global Memorization
Machine Learning (CS)
Helps AI learn from phones without seeing your data.