Score: 1

Rethinking Federated Graph Learning: A Data Condensation Perspective

Published: May 5, 2025 | arXiv ID: 2505.02573v1

By: Hao Zhang , Xunkai Li , Yinlin Zhu and more

Potential Business Impact:

Lets many organizations train graph models together while limiting privacy risks and communication costs.

Business Areas:
Collaborative Consumption, Collaboration

Federated graph learning (FGL) is a widely recognized technique that promotes collaborative training of graph neural networks (GNNs) across multi-client graphs. However, existing approaches rely heavily on communicating model parameters or gradients for federated optimization and fail to adequately address the data heterogeneity introduced by intricate and diverse graph distributions. Although some methods attempt to share additional messages between the server and clients to improve federated convergence, they introduce significant privacy risks and increase communication overhead. To address these issues, we introduce the condensed graph as a novel optimization carrier for FGL data heterogeneity and propose a new FGL paradigm called FedGM. Specifically, we utilize a generalized condensation graph consensus to aggregate comprehensive knowledge from distributed graphs, while minimizing communication costs and privacy risks through a single transmission of the condensed data. Extensive experiments on six public datasets consistently demonstrate the superiority of FedGM over state-of-the-art baselines, highlighting its potential as a novel FGL paradigm.
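The abstract's core idea — each client condenses its local graph once and transmits only the small condensed data, rather than exchanging parameters or gradients every round — can be sketched as below. This is an illustrative simplification, not the paper's algorithm: FedGM's actual condensation consensus is learned, whereas here a naive class-wise feature averaging stands in for condensation, graph structure is omitted, and all function names and the two-client setup are assumptions.

```python
import numpy as np

def condense_client_graph(X, y, nodes_per_class=2, rng=None):
    """Naive stand-in for graph condensation: summarize each class's
    node features as a few synthetic averaged nodes. The real FedGM
    learns the condensed graph; this proxy only illustrates the small
    payload each client would transmit exactly once."""
    rng = np.random.default_rng(0) if rng is None else rng
    Xs, ys = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        for _ in range(nodes_per_class):
            # average a random half of the class -> one synthetic node
            idx = rng.choice(len(Xc), size=max(1, len(Xc) // 2), replace=False)
            Xs.append(Xc[idx].mean(axis=0))
            ys.append(c)
    return np.stack(Xs), np.array(ys)

def server_aggregate(condensed):
    """Server-side consensus: pool every client's condensed nodes into
    one small surrogate dataset and fit on it (here, class centroids)."""
    X = np.concatenate([Xc for Xc, _ in condensed])
    y = np.concatenate([yc for _, yc in condensed])
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

# Two hypothetical clients with heterogeneous feature distributions.
rng = np.random.default_rng(42)
clients = []
for shift in (0.0, 3.0):  # heterogeneity: shifted per-client feature means
    X = np.vstack([rng.normal(shift + c, 1.0, size=(50, 4)) for c in (0, 1)])
    y = np.repeat([0, 1], 50)
    clients.append((X, y))

# Single communication round: each client sends its condensed graph once.
condensed = [condense_client_graph(X, y, rng=rng) for X, y in clients]
model = server_aggregate(condensed)
print(sorted(int(c) for c in model))  # classes learned from condensed data
```

The point of the sketch is the communication pattern: only `condensed` (8 synthetic nodes here, versus 200 raw nodes) crosses the network, and it crosses once, which is where the claimed savings in communication cost and privacy exposure come from.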

Country of Origin
🇨🇳 China

Page Count
9 pages

Category
Computer Science:
Machine Learning (CS)