Federated Learning Framework via Distributed Mutual Learning

Published: March 3, 2025 | arXiv ID: 2503.05803v1

By: Yash Gupta

Potential Business Impact:

Lets computers learn together without sharing private data.

Business Areas:
Collaborative Consumption, Collaboration

Federated Learning often relies on sharing full or partial model weights, which can burden network bandwidth and raise privacy risks. We present a loss-based alternative using distributed mutual learning. Instead of transmitting weights, clients periodically share their loss predictions on a public test set. Each client then refines its model by combining its local loss with the average Kullback-Leibler divergence over losses from other clients. This collaborative approach both reduces transmission overhead and preserves data privacy. Experiments on a face mask detection task demonstrate that our method outperforms weight-sharing baselines, achieving higher accuracy on unseen data while providing stronger generalization and privacy benefits.
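To make the training rule concrete, below is a minimal PyTorch sketch of one client's combined objective in the style of deep mutual learning. It assumes clients broadcast their softmax probabilities on the shared public test set; the function name, the `alpha` weighting knob, and the exact form of the KL term are illustrative assumptions based on this summary, not the paper's verbatim objective.

```python
import torch
import torch.nn.functional as F

def mutual_learning_loss(local_logits, labels, peer_probs_list, alpha=1.0):
    """Combine a client's local supervised loss with the average KL
    divergence to the predictions shared by its peers.

    local_logits:    (N, C) logits from this client's model on the public set
    labels:          (N,) ground-truth labels for the public set
    peer_probs_list: list of (N, C) probability tensors shared by other clients
    alpha:           weight on the mutual (KL) term -- a hypothetical knob
    """
    # Standard local objective on the public set.
    ce = F.cross_entropy(local_logits, labels)

    log_p_local = F.log_softmax(local_logits, dim=1)
    # F.kl_div(input, target) computes KL(target || input) and expects the
    # input as log-probabilities; average the divergence over all peers.
    kl_terms = [
        F.kl_div(log_p_local, peer_probs, reduction="batchmean")
        for peer_probs in peer_probs_list
    ]
    mutual = torch.stack(kl_terms).mean() if kl_terms else torch.zeros(())

    return ce + alpha * mutual
```

In one communication round, each client would evaluate its model on the public set, broadcast only the resulting probability tensor (rather than model weights), and then take gradient steps on this combined objective against the peers' shared predictions.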

Country of Origin
🇨🇦 Canada

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)