Online federated learning framework for classification
By: Wenxing Guo, Jinhan Xie, Jianya Lu, and others
Potential Business Impact:
Teaches computers to learn from streaming data without exposing private information.
In this paper, we develop a novel online federated learning framework for classification, designed to handle streaming data from multiple clients while ensuring data privacy and computational efficiency. Our method leverages the generalized distance-weighted discriminant technique, making it robust to both homogeneous and heterogeneous data distributions across clients. In particular, we develop a new optimization algorithm based on the Majorization-Minimization principle, integrated with a renewable estimation procedure, enabling efficient model updates without full retraining. We provide a theoretical guarantee for the convergence of our estimator, proving its consistency and asymptotic normality under standard regularity conditions. In addition, we establish that our method achieves Bayesian risk consistency, ensuring its reliability for classification tasks in federated environments. We further incorporate differential privacy mechanisms to enhance data security, protecting client information while maintaining model performance. Extensive numerical experiments on both simulated and real-world datasets demonstrate that our approach delivers high classification accuracy, significant computational efficiency gains, and substantial savings in data storage requirements compared to existing methods.
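To make the ingredients above concrete, here is a minimal sketch of one federated round: each client computes a gradient of a DWD-type loss on only its newest data batch (the "renewable" aspect — old batches are never revisited), adds Gaussian noise as a simple differential-privacy mechanism, and the server aggregates and takes a descent step. This is an illustrative toy, not the authors' algorithm: the function names, the plain gradient step standing in for the full Majorization-Minimization surrogate update, and the noise scale are all assumptions.

```python
import numpy as np

def dwd_loss_grad(w, X, y):
    # Gradient of the classical DWD loss V(u) = 1 - u for u <= 1/2,
    # V(u) = 1/(4u) otherwise, where u = y * (X @ w).
    u = y * (X @ w)
    dV = np.full_like(u, -1.0)          # V'(u) = -1 on the linear part
    mask = u > 0.5
    dV[mask] = -1.0 / (4.0 * u[mask] ** 2)  # V'(u) = -1/(4u^2) on the tail
    return X.T @ (dV * y) / len(y)

def client_update(w, X_batch, y_batch, noise_scale, rng):
    # Local gradient on the newest batch only, plus Gaussian noise
    # as a stand-in differential-privacy mechanism.
    g = dwd_loss_grad(w, X_batch, y_batch)
    return g + rng.normal(0.0, noise_scale, size=g.shape)

def server_round(w, client_grads, step=0.1):
    # Aggregate noisy client gradients and take one descent step
    # (the paper uses an MM surrogate here; this is a simplification).
    return w - step * np.mean(client_grads, axis=0)

# Toy streaming run: 4 clients, 50 rounds, labels driven by feature 0.
rng = np.random.default_rng(0)
d = 3
w = np.zeros(d)
for t in range(50):
    grads = []
    for _ in range(4):
        X = rng.normal(size=(32, d))
        y = np.sign(X[:, 0] + 0.1 * rng.normal(size=32))
        grads.append(client_update(w, X, y, noise_scale=0.01, rng=rng))
    w = server_round(w, grads)
```

Because each round touches only the current batches, the server's storage cost is independent of the stream length, which mirrors the storage savings the abstract claims for the renewable estimation procedure.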
Similar Papers
Federated Online Learning for Heterogeneous Multisource Streaming Data
Machine Learning (Stat)
Learns from many computers without sharing private data.
Federated Learning Framework via Distributed Mutual Learning
Machine Learning (CS)
Lets computers learn together without sharing private data.
Financial Data Analysis with Robust Federated Logistic Regression
Machine Learning (CS)
Keeps money data safe while learning from it.