Adaptive Branch Specialization in Spectral-Spatial Graph Neural Networks for Certified Robustness
By: Yoonhyuk Choi, Jiho Choi, Chong-Kwon Kim
Potential Business Impact:
Makes graph-based AI more accurate and certifiably harder to trick through tampered connections or features.
Recent Graph Neural Networks (GNNs) combine spectral and spatial architectures for enhanced representation learning. However, limited attention has been paid to their certified robustness, particularly regarding training strategies and the underlying rationale. In this paper, we propose SpecSphere, which explicitly specializes each branch: the spectral branch is trained to withstand ℓ0-bounded edge flips and to capture homophilic structure, while the spatial branch is designed to resist ℓ∞-bounded feature perturbations and heterophilic patterns. A context-aware gating network adaptively fuses the two representations, dynamically routing each node's prediction to the more reliable branch. The branches are trained with a specialized adversarial scheme that uses branch-specific inner maximization (structure attacks for the spectral branch, feature attacks for the spatial branch) and a unified alignment objective. We provide theoretical guarantees: (i) expressivity of the gating mechanism beyond 1-WL, (ii) a spectral-spatial frequency bias, and (iii) certified robustness together with its accuracy trade-off. Empirically, SpecSphere attains state-of-the-art node classification accuracy and offers tighter certified robustness guarantees on real-world benchmarks.
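The abstract does not include code; as a rough illustration of the fusion idea described above, the PyTorch sketch below shows one way a per-node, context-aware gate could mix a structure-driven (spectral-style) representation with a feature-driven (spatial-style) one. The module name GatedSpectralSpatialFusion, the specific branch encoders, and all hyperparameters are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedSpectralSpatialFusion(nn.Module):
    """Hypothetical sketch: fuse a spectral and a spatial branch with a per-node gate."""

    def __init__(self, in_dim: int, hid_dim: int, out_dim: int):
        super().__init__()
        # Assumed branch encoders: a low-pass (spectral-style) graph filter and
        # a feature-centric MLP standing in for the spatial branch.
        self.spectral_lin = nn.Linear(in_dim, hid_dim)
        self.spatial_mlp = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, hid_dim)
        )
        # Context-aware gate: maps the concatenated branch outputs to a
        # per-node mixing weight in [0, 1].
        self.gate = nn.Sequential(nn.Linear(2 * hid_dim, 1), nn.Sigmoid())
        self.classifier = nn.Linear(hid_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalized adjacency acts as a low-pass graph filter.
        deg = adj.sum(dim=1).clamp(min=1.0)
        d_inv_sqrt = deg.pow(-0.5)
        adj_norm = d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)

        h_spec = F.relu(adj_norm @ self.spectral_lin(x))   # structure-driven branch
        h_spat = self.spatial_mlp(x)                        # feature-driven branch

        # Per-node gate decides which branch to trust more for each node.
        alpha = self.gate(torch.cat([h_spec, h_spat], dim=-1))  # shape (N, 1)
        h = alpha * h_spec + (1.0 - alpha) * h_spat
        return self.classifier(h)


# Minimal usage on a random graph: 6 nodes, 8 features, 3 classes.
if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(6, 8)
    adj = (torch.rand(6, 6) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()    # symmetrize
    adj.fill_diagonal_(1.0)                # add self-loops
    model = GatedSpectralSpatialFusion(8, 16, 3)
    logits = model(x, adj)
    print(logits.shape)  # torch.Size([6, 3])

In the paper's adversarial training scheme, each branch would additionally face its own inner maximization (edge flips for the spectral branch, feature perturbations for the spatial branch); that loop is omitted here for brevity.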