Graph Structure Learning with Privacy Guarantees for Open Graph Data

Published: July 25, 2025 | arXiv ID: 2507.19116v1

By: Muhao Guo, Jiaqi Wu, Yang Weng, and more

Potential Business Impact:

Lets organizations publish open graph datasets while keeping individuals' private information protected.

Ensuring privacy in large-scale open datasets is increasingly challenging under regulations such as the General Data Protection Regulation (GDPR). While differential privacy (DP) provides strong theoretical guarantees, it primarily focuses on noise injection during model training, neglecting privacy preservation at the data publishing stage. Existing privacy-preserving data publishing (PPDP) approaches struggle to balance privacy and utility, particularly when data publishers and users are distinct entities. To address this gap, we focus on the graph recovery problem and propose a novel privacy-preserving estimation framework for open graph data, leveraging Gaussian DP (GDP) with a structured noise-injection mechanism. Unlike traditional methods that perturb gradients or model updates, our approach ensures unbiased graph structure recovery while enforcing DP at the data publishing stage. Moreover, we provide theoretical guarantees on estimation accuracy and extend our method to discrete-variable graphs, a setting often overlooked in DP research. Experimental results in graph learning demonstrate robust performance, offering a viable solution for privacy-conscious graph analysis.
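The core idea described in the abstract — injecting structured Gaussian noise into the data before it is published, rather than into model gradients — can be sketched with the standard Gaussian mechanism. This is an illustrative release step under μ-Gaussian DP, not the paper's exact estimator; the function name, the `sensitivity` bound, and the choice of `mu` are assumptions for the sketch.

```python
import numpy as np

def publish_with_gdp(X, sensitivity, mu, seed=None):
    """Release a noisy copy of data matrix X under mu-Gaussian DP.

    Standard Gaussian mechanism: add i.i.d. N(0, sigma^2) noise with
    sigma = sensitivity / mu, where `sensitivity` bounds how much one
    record can change X. Zero-mean noise keeps the release unbiased,
    so a downstream user can still estimate graph structure from it.
    (Illustrative sketch, not the paper's structured-noise mechanism.)
    """
    rng = np.random.default_rng(seed)
    sigma = sensitivity / mu
    return X + rng.normal(0.0, sigma, size=X.shape)

# A data publisher would release publish_with_gdp(X, sensitivity, mu)
# instead of X; a data user then runs graph recovery on the noisy copy.
```

Because the added noise is zero-mean, averaging-based estimators applied to the released matrix remain unbiased, which is the property the abstract highlights for graph structure recovery at the publishing stage.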

Country of Origin
🇺🇸 United States

Page Count
32 pages

Category
Computer Science:
Machine Learning (CS)