Score: 1

Measuring Over-smoothing beyond Dirichlet energy

Published: December 7, 2025 | arXiv ID: 2512.06782v1

By: Weiqi Guan, Zihao Shi

Potential Business Impact:

Provides metrics for detecting when graph-based AI models blur node features together (over-smoothing), which degrades their predictions.

Business Areas:
Energy Management, Energy

While Dirichlet energy serves as a prevalent metric for quantifying over-smoothing, it is inherently restricted to capturing first-order feature derivatives. To address this limitation, we propose a generalized family of node similarity measures based on the energy of higher-order feature derivatives. Through a rigorous theoretical analysis of the relationships among these measures, we establish the decay rates of Dirichlet energy under both continuous heat diffusion and discrete aggregation operators. Furthermore, our analysis reveals an intrinsic connection between the over-smoothing decay rate and the spectral gap of the graph Laplacian. Finally, empirical results demonstrate that attention-based Graph Neural Networks (GNNs) suffer from over-smoothing when evaluated under these proposed metrics.
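To make the quantities concrete, below is a minimal NumPy sketch of classical Dirichlet energy and one plausible higher-order generalization via powers of the graph Laplacian, illustrated under repeated mean aggregation. The form E_k(X) = tr(Xᵀ Lᵏ X) and all function names here are illustrative assumptions, not the paper's exact definitions.

```python
# Minimal sketch (not the authors' code) of Dirichlet energy and a
# hypothetical higher-order generalization via Laplacian powers.
import numpy as np

def graph_laplacian(adj: np.ndarray) -> np.ndarray:
    """Unnormalized Laplacian L = D - A for a symmetric adjacency matrix."""
    deg = np.diag(adj.sum(axis=1))
    return deg - adj

def higher_order_energy(x: np.ndarray, lap: np.ndarray, order: int = 1) -> float:
    """E_k(X) = trace(X^T L^k X); order=1 recovers classical Dirichlet energy.
    The paper's higher-order similarity measures may be defined differently."""
    return float(np.trace(x.T @ np.linalg.matrix_power(lap, order) @ x))

# Toy example: a 4-node path graph with random node features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
L = graph_laplacian(adj)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))

# Repeated mean aggregation (with self-loops, GCN-style) smooths the
# features; the energies shrink toward zero, illustrating the
# over-smoothing decay whose rate the paper ties to the spectral gap.
A_hat = adj + np.eye(4)
P = A_hat / A_hat.sum(axis=1, keepdims=True)   # row-normalized aggregation
for step in range(6):
    e1 = higher_order_energy(X, L, order=1)
    e2 = higher_order_energy(X, L, order=2)
    print(f"step {step}: E_1 = {e1:.4f}, E_2 = {e2:.4f}")
    X = P @ X
```

Running the loop shows both the first- and second-order energies decaying as aggregation steps accumulate, which is the qualitative behavior the abstract describes for heat diffusion and discrete aggregation operators.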

Country of Origin
🇨🇳 China

Page Count
17 pages

Category
Computer Science:
Machine Learning (CS)