DFed-SST: Building Semantic- and Structure-aware Topologies for Decentralized Federated Graph Learning

Published: August 15, 2025 | arXiv ID: 2508.11530v1

By: Lianshuai Guo, Zhongzheng Yuan, Xunkai Li, and more

Potential Business Impact:

Enables models to be trained collaboratively on graph data scattered across many machines, without relying on a central server.

Decentralized Federated Learning (DFL) has emerged as a robust distributed paradigm that circumvents the single-point-of-failure and communication-bottleneck risks of centralized architectures. However, a significant challenge arises because existing DFL optimization strategies, primarily designed for tasks such as computer vision, fail to address the unique topological information inherent in local subgraphs. Notably, while Federated Graph Learning (FGL) is tailored for graph data, it is predominantly implemented in a centralized server-client model, failing to leverage the benefits of decentralization. To bridge this gap, we propose DFed-SST, a decentralized federated graph learning framework with adaptive communication. The core of our method is a dual-topology adaptive communication mechanism that leverages the unique topological features of each client's local subgraph to dynamically construct and optimize the inter-client communication topology. This allows our framework to guide model aggregation efficiently in the face of heterogeneity. Extensive experiments on eight real-world datasets consistently demonstrate the superiority of DFed-SST, achieving a 3.26% improvement in average accuracy over baseline methods.
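The abstract describes building an inter-client communication topology from the structural and semantic features of each client's local subgraph, then using that topology to guide decentralized aggregation. The sketch below is a minimal illustration of that general idea, not the paper's actual mechanism: the signature features (degree histogram plus label distribution), the top-k similarity rule, and the mixing scheme are all assumptions chosen for simplicity.

```python
import numpy as np

def subgraph_signature(degrees, label_hist):
    """Summarize a client's local subgraph with a structural statistic
    (degree histogram) and a semantic one (label distribution).
    These particular features are illustrative assumptions."""
    deg_hist, _ = np.histogram(degrees, bins=10, range=(0, 50), density=True)
    label_dist = label_hist / max(label_hist.sum(), 1)
    return np.concatenate([deg_hist, label_dist])

def build_comm_topology(signatures, k=3):
    """Connect each client to its k most similar peers by cosine
    similarity, producing a sparse, row-normalized weight matrix
    that can serve as a communication topology."""
    sig = np.stack(signatures)
    norm = sig / (np.linalg.norm(sig, axis=1, keepdims=True) + 1e-12)
    sim = np.maximum(norm @ norm.T, 0.0)    # clip to non-negative weights
    np.fill_diagonal(sim, -np.inf)          # exclude self-edges
    n = sim.shape[0]
    weights = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sim[i])[-k:]      # top-k most similar clients
        weights[i, nbrs] = sim[i, nbrs]
        weights[i] /= weights[i].sum() + 1e-12
    return weights

def neighbor_aggregate(local_params, weights, self_weight=0.5):
    """Mix each client's flattened model parameters with its neighbors'
    using the topology weights (simple weighted averaging)."""
    stacked = np.stack(local_params)        # shape: (n_clients, n_params)
    mixed = weights @ stacked
    return self_weight * stacked + (1 - self_weight) * mixed
```

In this sketch, clients whose subgraphs look alike (similar degree and label statistics) exchange models more heavily, which is one plausible way to cope with the heterogeneity the abstract mentions; the paper's dual-topology mechanism should be consulted for the actual construction and optimization details.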

Page Count
24 pages

Category
Computer Science:
Machine Learning (CS)