Measuring Over-smoothing beyond Dirichlet energy
By: Weiqi Guan, Zihao Shi
Potential Business Impact:
Measures when deep graph AI models blur their inputs into sameness, helping engineers build deeper, more reliable networks.
While Dirichlet energy serves as a prevalent metric for quantifying over-smoothing, it is inherently restricted to capturing first-order feature derivatives. To address this limitation, we propose a generalized family of node similarity measures based on the energy of higher-order feature derivatives. Through a rigorous theoretical analysis of the relationships among these measures, we establish the decay rates of Dirichlet energy under both continuous heat diffusion and discrete aggregation operators. Furthermore, our analysis reveals an intrinsic connection between the over-smoothing decay rate and the spectral gap of the graph Laplacian. Finally, empirical results demonstrate that attention-based Graph Neural Networks (GNNs) suffer from over-smoothing when evaluated under these proposed metrics.
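To make the quantities concrete, below is a minimal Python sketch, assuming the proposed measures take the form trace(X^T L^k X) with L the combinatorial graph Laplacian (the paper's exact definitions may differ). Here k = 1 recovers Dirichlet energy and k > 1 gives a higher-order analogue; the toy run also illustrates the abstract's decay claim by evolving features under heat diffusion X(t) = exp(-tL) X(0).

import numpy as np
from scipy.linalg import expm

def graph_laplacian(A):
    # Combinatorial Laplacian L = D - A of an undirected graph.
    return np.diag(A.sum(axis=1)) - A

def energy(X, L, k=1):
    # trace(X^T L^k X); k = 1 is Dirichlet energy, i.e.
    # (1/2) * sum over edges (i, j) of ||x_i - x_j||^2,
    # while k > 1 measures k-th order feature derivatives.
    return np.trace(X.T @ np.linalg.matrix_power(L, k) @ X)

# Toy graph with 4 nodes and random 3-dimensional node features.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = graph_laplacian(A)
X0 = np.random.default_rng(0).standard_normal((4, 3))

# Spectral gap: the smallest nonzero Laplacian eigenvalue.
gap = np.sort(np.linalg.eigvalsh(L))[1]

for t in (0.0, 1.0, 2.0):
    Xt = expm(-t * L) @ X0  # heat diffusion X(t) = exp(-tL) X(0)
    print(f"t={t}: E_1={energy(Xt, L, 1):.4f}, E_2={energy(Xt, L, 2):.4f}, "
          f"bound={np.exp(-2.0 * gap * t) * energy(X0, L, 1):.4f}")

Both energies decay toward zero as t grows, and E_1 stays below the bound e^(-2*lambda_1*t) * E_1(0), illustrating the stated connection between the over-smoothing decay rate and the spectral gap lambda_1.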
Similar Papers
Comment on "A Note on Over-Smoothing for Graph Neural Networks"
Machine Learning (CS)
Fixes "fuzzy" computer learning for better results.
An Active Diffusion Neural Network for Graphs
Machine Learning (CS)
Lets information spread adaptively through a network so computers can learn from it better.
Learning Graph from Smooth Signals under Partial Observation: A Robustness Analysis
Machine Learning (CS)
Finds hidden connections in networks even with missing data.