Clustered Federated Learning with Hierarchical Knowledge Distillation

Published: December 11, 2025 | arXiv ID: 2512.10443v1

By: Sabtain Ahmad, Meerzhan Kanatbekova, Ivona Brandic, and more

Potential Business Impact:

Helps smart devices learn together without sharing private data.

Business Areas:
Cloud Computing, Internet Services, Software

Clustered Federated Learning (CFL) has emerged as a powerful approach for addressing data heterogeneity and ensuring privacy in large distributed IoT environments. By clustering clients and training cluster-specific models, CFL enables personalized models tailored to groups of heterogeneous clients. However, conventional CFL approaches suffer from fragmented learning: they train an independent global model for each cluster and fail to take advantage of collective cluster insights. This paper advocates a shift to hierarchical CFL, allowing bi-level aggregation to train cluster-specific models at the edge and a unified global model at the cloud. This shift improves training efficiency yet may introduce communication challenges. To this end, we propose CFLHKD, a novel personalization scheme for integrating hierarchical cluster knowledge into CFL. Built upon multi-teacher knowledge distillation, CFLHKD enables inter-cluster knowledge sharing while preserving cluster-specific personalization. CFLHKD adopts bi-level aggregation to bridge the gap between local and global learning. Extensive evaluations on standard benchmark datasets demonstrate that CFLHKD outperforms representative baselines in both cluster-specific and global model accuracy, achieving a performance improvement of 3.32-7.57%.
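To make the two core mechanisms concrete, below is a minimal PyTorch sketch of bi-level weighted aggregation and a multi-teacher distillation loss. This is an illustration under assumptions, not the paper's implementation: the function names (`fed_avg`, `distill_step`), uniform teacher weighting, temperature `T`, and mixing weight `alpha` are all hypothetical choices for exposition.

```python
import torch
import torch.nn.functional as F

def fed_avg(state_dicts, weights):
    """Weighted parameter averaging. In a bi-level scheme this is
    applied twice: over clients within a cluster (edge level), then
    over the resulting cluster models (cloud level).
    Assumes float-valued parameters throughout."""
    total = sum(weights)
    return {
        key: sum(w * sd[key] for w, sd in zip(weights, state_dicts)) / total
        for key in state_dicts[0]
    }

def distill_step(student, teachers, x, y, T=2.0, alpha=0.5):
    """One multi-teacher distillation step: hard-label cross-entropy
    plus KL divergence against the averaged soft targets of several
    teacher models (e.g., other clusters' models), so a cluster model
    absorbs inter-cluster knowledge without sharing raw data."""
    student_logits = student(x)

    # Average the teachers' temperature-softened predictions.
    # Uniform weighting is assumed here; a real scheme may weight
    # teachers by cluster similarity or data size.
    with torch.no_grad():
        soft_targets = torch.stack(
            [F.softmax(t(x) / T, dim=-1) for t in teachers]
        ).mean(dim=0)

    ce = F.cross_entropy(student_logits, y)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        soft_targets,
        reduction="batchmean",
    ) * (T * T)  # standard temperature scaling of the KD term
    return alpha * ce + (1 - alpha) * kd
```

In this reading, the distillation term is what lets clusters share insights while `fed_avg` at the edge preserves per-cluster personalization; how CFLHKD actually weights teachers and schedules the two aggregation levels is detailed in the paper itself.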

Country of Origin
🇦🇹 Austria

Page Count
12 pages

Category
Computer Science:
Distributed, Parallel, and Cluster Computing