Score: 1

Solving Over-Smoothing in GNNs via Nonlocal Message Passing: Algebraic Smoothing and Depth Scalability

Published: December 9, 2025 | arXiv ID: 2512.08475v1

By: Weiqi Guan, Junlin He

Potential Business Impact:

Enables graph neural networks to be trained at much greater depth (up to 256 layers) with improved accuracy and no extra parameters, lowering the cost of scaling graph-based models.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

The relationship between Layer Normalization (LN) placement and the over-smoothing phenomenon remains underexplored. The authors identify a critical dilemma: Pre-LN architectures avoid over-smoothing but suffer from the curse of depth, while Post-LN architectures bypass the curse of depth but experience over-smoothing. To resolve this, they propose a new method based on Post-LN that induces algebraic smoothing, preventing over-smoothing without incurring the curse of depth. Empirical results across five benchmarks demonstrate that the approach supports deeper networks (up to 256 layers) and improves performance while requiring no additional parameters.

Key contributions:
- Theoretical characterization: an analysis of LN dynamics and their impact on over-smoothing and the curse of depth.
- A principled solution: a parameter-efficient method that induces algebraic smoothing, avoiding both over-smoothing and the curse of depth.
- Empirical validation: extensive experiments showing the effectiveness of the method in deeper GNNs.
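The Pre-LN/Post-LN dilemma is easiest to see in code. Below is a minimal PyTorch-style sketch contrasting the two normalization placements in a residual GNN layer. The class and argument names are hypothetical, and the aggregation is a plain normalized neighbor mean for illustration, not the paper's nonlocal message passing or its algebraic-smoothing method.

```python
import torch
import torch.nn as nn


class MessagePassingBlock(nn.Module):
    """One residual GNN layer with switchable LN placement.

    This is an illustrative sketch of the Pre-LN vs. Post-LN
    dilemma described in the paper, not the paper's method.
    """

    def __init__(self, dim: int, norm: str = "post"):
        super().__init__()
        self.norm_style = norm  # "pre" or "post" (hypothetical flag)
        self.ln = nn.LayerNorm(dim)
        self.lin = nn.Linear(dim, dim)

    def propagate(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Normalized neighbor mean over a dense adjacency matrix
        # (stand-in for any message-passing aggregation).
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return (adj @ x) / deg

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        if self.norm_style == "pre":
            # Pre-LN: normalize before aggregation; the residual path
            # stays un-normalized, which helps against over-smoothing
            # but lets activations grow with depth (curse of depth).
            return x + self.lin(self.propagate(self.ln(x), adj)).relu()
        # Post-LN: normalize after the residual sum; depth-stable, but
        # repeated normalization and aggregation pull node features
        # toward a common direction (over-smoothing).
        return self.ln(x + self.lin(self.propagate(x, adj)).relu())


# Usage on a small random graph (illustrative values only):
x = torch.randn(100, 64)                       # 100 nodes, 64-dim features
adj = (torch.rand(100, 100) < 0.05).float()    # random adjacency
out = MessagePassingBlock(64, norm="post")(x, adj)
```

Stacking many such blocks makes the tradeoff visible: the Pre-LN variant keeps node features distinct but their norms drift with depth, while the Post-LN variant stays numerically stable but node features converge. The paper's contribution is a Post-LN-based design that keeps the stability while controlling the rate of that convergence (algebraic rather than exponential smoothing).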

Country of Origin
🇭🇰 🇨🇳 China, Hong Kong

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)