DFCA: Decentralized Federated Clustering Algorithm
By: Jonas Kirch, Sebastian Becker, Tiago Koketsu Rodrigues, and more
Potential Business Impact:
Lets computers learn together without a boss.
Clustered Federated Learning has emerged as an effective approach for handling heterogeneous data across clients by partitioning them into clusters with similar or identical data distributions. However, most existing methods, including the Iterative Federated Clustering Algorithm (IFCA), rely on a central server to coordinate model updates, which creates a bottleneck and a single point of failure, limiting their applicability in more realistic decentralized learning settings. In this work, we introduce DFCA, a fully decentralized clustered FL algorithm that enables clients to collaboratively train cluster-specific models without central coordination. DFCA uses a sequential running average to aggregate models from neighbors as updates arrive, providing a communication-efficient alternative to batch aggregation while maintaining clustering performance. Our experiments on various datasets demonstrate that DFCA outperforms other decentralized algorithms and performs comparably to centralized IFCA, even under sparse connectivity, highlighting its robustness and practicality for dynamic real-world decentralized networks.
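The core mechanism described in the abstract is a sequential running average: instead of waiting to batch-average all neighbor models, a client folds each neighbor's parameters into its current average as they arrive. A minimal sketch of that update rule, assuming plain parameter vectors (the function name `fold_in` and the toy values are illustrative, not from the paper):

```python
def fold_in(avg, model, count):
    """Incremental running-average update for one arriving neighbor model.

    Implements avg_k = avg_{k-1} + (model_k - avg_{k-1}) / k, which after
    k arrivals equals the batch mean of all k models. Hypothetical helper,
    not the authors' implementation.
    """
    return [a + (m - a) / count for a, m in zip(avg, model)]

# Toy usage: three neighbor parameter vectors arrive one at a time.
models = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
avg = list(models[0])
for k, m in enumerate(models[1:], start=2):
    avg = fold_in(avg, m, k)
print(avg)  # [3.0, 4.0], identical to the batch mean
```

The appeal of this formulation is that a client needs to hold only its current average and a counter, rather than buffering every neighbor's model until a synchronization point, which is what makes it communication- and memory-efficient in sparse, asynchronous networks.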
Similar Papers
LCFed: An Efficient Clustered Federated Learning Framework for Heterogeneous Data
Machine Learning (CS)
Makes AI learn better from different groups of data.
A new type of federated clustering: A non-model-sharing approach
Machine Learning (CS)
Lets groups learn from private data together.
On the Fast Adaptation of Delayed Clients in Decentralized Federated Learning: A Centroid-Aligned Distillation Approach
Machine Learning (CS)
Makes AI learn faster with less data sent.