Score: 1

dFLMoE: Decentralized Federated Learning via Mixture of Experts for Medical Data Analysis

Published: March 13, 2025 | arXiv ID: 2503.10412v4

By: Luyuan Xie, Tianyu Luan, Wenyuan Cai and more

Potential Business Impact:

Lets hospitals collaboratively train medical AI models and share knowledge without exposing patient data or depending on a central server.

Business Areas:
MOOC Education, Software

Federated learning has wide applications in the medical field. It enables knowledge sharing among healthcare institutions while protecting patients' privacy. However, existing federated learning systems are typically centralized: clients must upload client-specific knowledge to a central server for aggregation. This centralized integration blends the knowledge from all clients together, so the knowledge is already degraded before it is sent back to each client. The centralized approach also creates a dependency on the central server, which can destabilize training if the server malfunctions or connections are unreliable. To address these issues, we propose a decentralized federated learning framework named dFLMoE. In our framework, clients directly exchange lightweight head models with each other. After the exchange, each client treats both its local head model and the received head models as individual experts, and uses a client-specific Mixture of Experts (MoE) approach to make collective decisions. This design reduces knowledge damage through client-specific aggregation and removes the dependency on the central server, improving the robustness of the framework. We validate our framework on multiple medical tasks, showing that it clearly outperforms state-of-the-art approaches under both model homogeneity and heterogeneity settings.
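The abstract describes each client combining its own head model with heads received from peers through a client-specific Mixture-of-Experts gate. The following PyTorch sketch illustrates that idea under stated assumptions; the class name `ClientMoE`, the stand-in backbone, the choice to freeze received heads, and the softmax gating are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch (assumptions noted above): a client keeps a local feature
# backbone and a lightweight head, receives peer heads, and learns a
# client-specific gate that weights all heads as experts.
import torch
import torch.nn as nn

class ClientMoE(nn.Module):
    def __init__(self, feat_dim: int, num_classes: int, peer_heads: list):
        super().__init__()
        # Stand-in backbone; a real client would use its own local model.
        self.backbone = nn.Sequential(nn.Linear(32, feat_dim), nn.ReLU())
        self.local_head = nn.Linear(feat_dim, num_classes)
        # Heads received from peers act as extra experts; frozen here by assumption.
        self.peer_heads = nn.ModuleList(peer_heads)
        for head in self.peer_heads:
            for p in head.parameters():
                p.requires_grad_(False)
        num_experts = 1 + len(peer_heads)
        # Client-specific gating network over all experts.
        self.gate = nn.Linear(feat_dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(x)
        # Each expert (local + received heads) produces its own prediction.
        expert_logits = torch.stack(
            [self.local_head(feats)] + [h(feats) for h in self.peer_heads], dim=1
        )  # (batch, num_experts, num_classes)
        # Gate weights decide how much each expert contributes per sample.
        weights = torch.softmax(self.gate(feats), dim=-1).unsqueeze(-1)
        return (weights * expert_logits).sum(dim=1)  # collective decision

# Usage: heads received from two peer clients are plugged in as extra experts.
peers = [nn.Linear(16, 3), nn.Linear(16, 3)]
model = ClientMoE(feat_dim=16, num_classes=3, peer_heads=peers)
print(model(torch.randn(4, 32)).shape)  # torch.Size([4, 3])
```

Because only the small head models travel between clients, communication stays lightweight, and each client's gate performs its own aggregation, which is the client-specific step the paper credits with reducing knowledge damage and removing the central server.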

Country of Origin
🇺🇸 United States

Page Count
14 pages

Category
Computer Science:
Machine Learning (CS)