Mitigating Over-Squashing in Graph Neural Networks by Spectrum-Preserving Sparsification
By: Langzhang Liang, Fanchen Bu, Zixing Song, and more
Potential Business Impact:
Helps computers understand complex data better.
The message-passing paradigm of Graph Neural Networks often struggles to exchange information across distant nodes, typically because of structural bottlenecks in certain graph regions, a limitation known as over-squashing. To reduce such bottlenecks, graph rewiring, which modifies the graph topology, has been widely used. However, existing graph rewiring techniques often overlook the need to preserve critical properties of the original graph, e.g., its spectral properties. Moreover, many approaches rely on increasing the edge count to improve connectivity, which introduces significant computational overhead and exacerbates the risk of over-smoothing. In this paper, we propose a novel graph rewiring method that leverages spectrum-preserving graph sparsification to mitigate over-squashing. Our method generates graphs with enhanced connectivity while maintaining sparsity and largely preserving the original graph spectrum, effectively balancing structural bottleneck reduction and graph property preservation. Experimental results validate the effectiveness of our approach, demonstrating its superiority over strong baseline methods in classification accuracy and retention of the Laplacian spectrum.
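The abstract does not spell out the sparsification procedure, so the sketch below is only a point of reference: a minimal implementation of classical spectrum-preserving sparsification via effective-resistance sampling (Spielman-Srivastava), the standard technique this line of work builds on. The function name spectral_sparsify, the sample count q, and the use of networkx are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
import networkx as nx

def spectral_sparsify(G, q=None, seed=0):
    """Sample edges with probability proportional to effective resistance,
    producing a sparse reweighted graph whose Laplacian spectrum
    approximates that of G (Spielman-Srivastava style sketch)."""
    rng = np.random.default_rng(seed)
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)

    # Pseudoinverse of the graph Laplacian, used to compute effective resistances.
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    L_pinv = np.linalg.pinv(L)

    edges = list(G.edges())
    m = len(edges)
    # Number of edge samples; this choice is illustrative, not tuned.
    q = q or max(1, int(4 * n * np.log(max(n, 2))))

    # Effective resistance of edge (u, v): (e_u - e_v)^T L^+ (e_u - e_v).
    R = np.array([
        L_pinv[idx[u], idx[u]] + L_pinv[idx[v], idx[v]] - 2.0 * L_pinv[idx[u], idx[v]]
        for u, v in edges
    ])
    p = R / R.sum()  # sampling probabilities proportional to effective resistance

    H = nx.Graph()
    H.add_nodes_from(nodes)
    for s in rng.choice(m, size=q, p=p):
        u, v = edges[s]
        w = 1.0 / (q * p[s])  # reweight so the expected Laplacian matches the original
        if H.has_edge(u, v):
            H[u][v]["weight"] += w
        else:
            H.add_edge(u, v, weight=w)
    return H
```

For instance, calling spectral_sparsify(nx.karate_club_graph()) returns a reweighted subgraph whose Laplacian eigenvalues roughly track those of the input; the paper's contribution lies in combining this kind of spectrum preservation with connectivity-enhancing rewiring, which this sketch does not attempt.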
Similar Papers
Spectral Neural Graph Sparsification
Machine Learning (CS)
Makes computer models of networks faster and better.
Over-Squashing in GNNs and Causal Inference of Rewiring Strategies
Machine Learning (CS)
Fixes computer learning problems in complex data.
Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report
Machine Learning (CS)
Makes smart computer graphs learn faster on huge data.