Federated Learning Framework via Distributed Mutual Learning
By: Yash Gupta
Potential Business Impact:
Lets computers learn together without sharing private data.
Federated Learning often relies on sharing full or partial model weights, which burdens network bandwidth and raises privacy risks. We present a loss-based alternative built on distributed mutual learning. Instead of transmitting weights, clients periodically share their soft predictions on a public test set. Each client then refines its model by combining its local training loss with the average Kullback-Leibler divergence between its own predictions and those shared by the other clients. This collaborative approach both reduces transmission overhead and preserves data privacy. Experiments on a face mask detection task demonstrate that our method outperforms weight-sharing baselines, achieving higher accuracy on unseen data while providing stronger generalization and privacy benefits.
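As a concrete illustration of the update described above, here is a minimal PyTorch sketch, not the authors' implementation: the names (MutualClient, kl_weight, public_x) are placeholders, and the KL(peer || self) term follows the standard deep mutual learning formulation assumed here.

```python
# Minimal sketch of one mutual-learning client, assuming a PyTorch setup.
import torch
import torch.nn.functional as F
from torch import nn

class MutualClient:
    def __init__(self, model, lr=1e-3, kl_weight=1.0):
        self.model = model
        self.opt = torch.optim.Adam(model.parameters(), lr=lr)
        self.kl_weight = kl_weight  # hypothetical trade-off between local loss and peer agreement

    @torch.no_grad()
    def share_predictions(self, public_x):
        """Soft predictions on the shared public set; this is all that leaves the client."""
        self.model.eval()
        return F.softmax(self.model(public_x), dim=1)

    def local_step(self, x, y, public_x, peer_probs):
        """One update: local cross-entropy plus the mean KL toward each peer's predictions."""
        self.model.train()
        self.opt.zero_grad()
        ce = F.cross_entropy(self.model(x), y)  # local loss on private data
        log_p = F.log_softmax(self.model(public_x), dim=1)
        # KL(peer || self) on the public set, averaged over all other clients.
        kl = torch.stack(
            [F.kl_div(log_p, q, reduction="batchmean") for q in peer_probs]
        ).mean()
        loss = ce + self.kl_weight * kl
        loss.backward()
        self.opt.step()
        return loss.item()

# Toy communication rounds: 3 clients, synthetic private data, a shared public batch.
torch.manual_seed(0)
public_x = torch.randn(64, 20)
clients = [
    MutualClient(nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2)))
    for _ in range(3)
]
for _ in range(5):
    probs = [c.share_predictions(public_x) for c in clients]  # only predictions are exchanged
    for i, c in enumerate(clients):
        peers = [p for j, p in enumerate(probs) if j != i]
        x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))  # stand-in private batch
        c.local_step(x, y, public_x, peers)
```

Under these assumptions, only the softmax outputs on the public set cross the network, so the per-round payload scales with the public set size and the number of classes rather than with the model's parameter count.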
Similar Papers
Online federated learning framework for classification
Machine Learning (Stat)
Teaches computers to learn from private data.
Federated Learning for Cross-Domain Data Privacy: A Distributed Approach to Secure Collaboration
Machine Learning (CS)
Keeps your private data safe when sharing.
A Novel Algorithm for Personalized Federated Learning: Knowledge Distillation with Weighted Combination Loss
Machine Learning (Stat)
Teaches computers to learn from private data better.