Score: 1

Solving Oversmoothing in GNNs via Nonlocal Message Passing: Algebraic Smoothing and Depth Scalability

Published: December 9, 2025 | arXiv ID: 2512.08475v2

By: Weiqi Guan, Junlin He

Potential Business Impact:

Enables graph neural networks to be trained much deeper (up to 256 layers) with better performance and no extra parameters.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

The relationship between Layer Normalization (LN) placement and the oversmoothing phenomenon remains underexplored. We identify a critical dilemma: Pre-LN architectures avoid oversmoothing but suffer from the curse of depth, while Post-LN architectures bypass the curse of depth but experience oversmoothing. To resolve this, we propose a new method based on Post-LN that induces algebraic smoothing, preventing oversmoothing without the curse of depth. Empirical results across five benchmarks demonstrate that our approach supports deeper networks (up to 256 layers) and improves performance, requiring no additional parameters.

Key contributions:
- Theoretical characterization: analysis of LN dynamics and their impact on oversmoothing and the curse of depth.
- A principled solution: a parameter-efficient method that induces algebraic smoothing and avoids both oversmoothing and the curse of depth.
- Empirical validation: extensive experiments showing the effectiveness of the method in deeper GNNs.
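To make the Pre-LN/Post-LN dilemma concrete, here is a minimal PyTorch sketch of the two placements in a generic message-passing layer. This is not the authors' algebraic-smoothing method (which the paper builds on top of the Post-LN form); the class name, the dense row-normalized adjacency, and the toy usage are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hedged sketch: a generic message-passing layer contrasting the Pre-LN and
# Post-LN placements discussed in the abstract. NOT the paper's method; all
# names (GNNLayer, adj) and the dense propagation are illustrative.

class GNNLayer(nn.Module):
    def __init__(self, dim: int, norm: str = "post"):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.ln = nn.LayerNorm(dim)
        self.norm = norm  # "pre": LN before propagation; "post": LN after the residual sum

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        if self.norm == "pre":
            # Pre-LN: normalize the input, message-pass, then add the residual.
            # Per the abstract, this avoids oversmoothing but suffers the
            # curse of depth.
            h = self.ln(x)
            return x + adj @ self.lin(h)
        # Post-LN: message-pass with a residual, then normalize the sum.
        # Per the abstract, this avoids the curse of depth but oversmooths;
        # the paper modifies this placement so smoothing decays algebraically.
        return self.ln(x + adj @ self.lin(x))

# Toy usage on a 4-node graph with a row-normalized random adjacency (assumption).
n, d = 4, 8
adj = torch.rand(n, n)
adj = adj / adj.sum(dim=1, keepdim=True)
x = torch.randn(n, d)
for layer in [GNNLayer(d, norm="post") for _ in range(4)]:
    x = layer(x, adj)
print(x.shape)  # torch.Size([4, 8])
```

Stacking the Post-LN variant to hundreds of layers is where, per the abstract, node features would normally collapse to indistinguishable values; the paper's contribution is a parameter-free change to this placement that slows that collapse to an algebraic rate.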

Country of Origin
🇭🇰 🇨🇳 Hong Kong, China

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)