A Novel Algorithm for Personalized Federated Learning: Knowledge Distillation with Weighted Combination Loss

Published: April 6, 2025 | arXiv ID: 2504.04642v1

By: Hengrui Hu, Anai N. Kothari, Anjishnu Banerjee

Potential Business Impact:

Lets organizations train accurate, personalized models on users' private data without ever collecting that data centrally.

Business Areas:
Personalization, Commerce and Shopping

Federated learning (FL) offers a privacy-preserving framework for distributed machine learning, enabling collaborative model training across diverse clients without centralizing sensitive data. However, statistical heterogeneity, characterized by non-independent and identically distributed (non-IID) client data, poses significant challenges, leading to model drift and poor generalization. This paper proposes a novel algorithm, pFedKD-WCL (Personalized Federated Knowledge Distillation with Weighted Combination Loss), which integrates knowledge distillation with bi-level optimization to address non-IID challenges. pFedKD-WCL leverages the current global model as a teacher to guide local models, optimizing both global convergence and local personalization efficiently. We evaluate pFedKD-WCL on the MNIST dataset and a synthetic dataset with non-IID partitioning, using multinomial logistic regression and multilayer perceptron models. Experimental results demonstrate that pFedKD-WCL outperforms state-of-the-art algorithms, including FedAvg, FedProx, Per-FedAvg, and pFedMe, in terms of accuracy and convergence speed.
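
To make the core idea concrete, here is a minimal sketch of a weighted combination loss in PyTorch, in which the frozen global model acts as the teacher for each client's local (student) model. The weight alpha, the distillation temperature, and the exact combination rule are illustrative assumptions; the abstract does not specify the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def weighted_kd_loss(student_logits, teacher_logits, labels,
                     alpha=0.5, temperature=2.0):
    """Weighted combination of a local task loss and a distillation loss.

    alpha and temperature are illustrative hyperparameters,
    not values taken from the paper.
    """
    # Supervised loss on the client's own (possibly non-IID) data.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Distillation loss: match the local model's softened predictions
    # to those of the frozen global (teacher) model.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # alpha trades off local personalization (fit to client data)
    # against alignment with the global model.
    return alpha * ce_loss + (1.0 - alpha) * kd_loss
```

In a local client update, teacher_logits would be computed from the global model with gradients disabled (e.g. inside torch.no_grad()), so only the personalized local model is optimized against the combined objective.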

Country of Origin
🇺🇸 United States

Page Count
9 pages

Category
Statistics: Machine Learning (stat.ML)