Sheaf Graph Neural Networks via PAC-Bayes Spectral Optimization
By: Yoonhyuk Choi, Jiho Choi, Chong-Kwon Kim
Potential Business Impact:
Helps computers understand tricky connections in data.
Over-smoothing in Graph Neural Networks (GNNs) causes distinct node features to collapse, particularly on heterophilic graphs, where adjacent nodes often have dissimilar labels. Although sheaf neural networks partially mitigate this problem, they typically rely on static or heavily parameterized sheaf structures that hinder generalization and scalability: existing sheaf-based models either predefine restriction maps or introduce excessive complexity, and fail to provide rigorous stability guarantees. In this paper, we introduce SGPC (Sheaf GNNs with PAC-Bayes Calibration), a unified architecture that combines cellular-sheaf message passing with optimal-transport-based lifting, variance-reduced diffusion, and PAC-Bayes spectral regularization for robust semi-supervised node classification. We establish theoretical performance bounds and show that the resulting bound-aware objective can be optimized end to end with linear computational complexity. Experiments on nine homophilic and heterophilic benchmarks show that SGPC outperforms state-of-the-art spectral and sheaf-based GNNs while providing certified confidence intervals on unseen nodes.
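To make the cellular-sheaf message passing mentioned in the abstract concrete, here is a minimal numpy sketch of standard sheaf diffusion: each edge carries a pair of learned restriction maps, these assemble into a block-structured sheaf Laplacian, and one diffusion step moves node features down the sheaf-Laplacian gradient. This is a generic illustration of sheaf diffusion, not the SGPC architecture itself; the function names, the dense-matrix construction, and the fixed step size `alpha` are all assumptions for illustration.

```python
import numpy as np

def sheaf_laplacian(n, d, edges, maps):
    """Dense sheaf Laplacian L_F (nd x nd) assembled from per-edge
    restriction maps. edges: list of (u, v) node pairs; maps[i] = (Fu, Fv),
    each a d x d matrix restricting the node stalk onto the edge stalk."""
    L = np.zeros((n * d, n * d))
    for (u, v), (Fu, Fv) in zip(edges, maps):
        su = slice(u * d, (u + 1) * d)
        sv = slice(v * d, (v + 1) * d)
        L[su, su] += Fu.T @ Fu   # diagonal block for node u
        L[sv, sv] += Fv.T @ Fv   # diagonal block for node v
        L[su, sv] -= Fu.T @ Fv   # off-diagonal coupling u -> v
        L[sv, su] -= Fv.T @ Fu   # off-diagonal coupling v -> u
    return L

def diffusion_step(X, L, alpha=0.1):
    """One explicit sheaf-diffusion step: X <- X - alpha * L_F X.
    X has shape (n, d): one d-dimensional stalk vector per node."""
    n, d = X.shape
    return X - alpha * (L @ X.reshape(n * d, 1)).reshape(n, d)
```

With identity restriction maps this reduces to the ordinary graph Laplacian acting blockwise, and constant signals are left unchanged (they are harmonic); non-trivial restriction maps let adjacent heterophilic nodes keep distinct features, which is the property the abstract appeals to.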
Similar Papers
Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report
Machine Learning (CS)
Makes smart computer graphs learn faster on huge data.
Polynomial Neural Sheaf Diffusion: A Spectral Filtering Approach on Cellular Sheaves
Machine Learning (CS)
Makes AI understand complex data better and faster.
Spectral Neural Graph Sparsification
Machine Learning (CS)
Makes computer models of networks faster and better.