Score: 3

Understanding and Tackling Over-Dilution in Graph Neural Networks

Published: August 22, 2025 | arXiv ID: 2508.16829v1

By: Junhyun Lee, Veronika Thost, Bumsoo Kim, and more

BigTech Affiliations: IBM

Potential Business Impact:

Keeps machine learning on graphs from losing important node-specific details.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Message Passing Neural Networks (MPNNs) hold a key position in machine learning on graphs, but they struggle with unintended behaviors, such as over-smoothing and over-squashing, due to irregular data structures. The observation and formulation of these limitations have become foundational in constructing more informative graph representations. In this paper, we delve into the limitations of MPNNs, focusing on aspects that have previously been overlooked. Our observations reveal that even within a single layer, the information specific to an individual node can become significantly diluted. To examine this phenomenon in depth, we present the concept of over-dilution and formulate it with two dilution factors: intra-node dilution at the attribute level and inter-node dilution at the node level. We also introduce a transformer-based solution that alleviates over-dilution and complements existing node embedding methods like MPNNs. Our findings provide new insights and contribute to the development of more informative representations. The implementation and supplementary materials are publicly available at https://github.com/LeeJunHyun/NATR.
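
The core observation, that a node's own signal can shrink substantially within a single layer, is easy to reproduce with plain mean aggregation. Below is a minimal sketch of that node-level effect; it assumes a simple star graph and a mean-pooling update, and all names (X, A, own_share) are illustrative rather than taken from the paper or the NATR repository:

```python
# Minimal sketch (not the paper's implementation) of inter-node dilution:
# after one mean-aggregation message-passing step, a high-degree node's
# own attributes carry only a 1/(deg+1) share of its new representation.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_feats = 5, 8
X = rng.normal(size=(n_nodes, n_feats))   # node attribute matrix
A = np.array([[0, 1, 1, 1, 1],            # star graph: node 0 is the hub
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0]], dtype=float)

# One layer of mean aggregation with self-loops:
# h_v = mean({x_v} U {x_u : u ~ v})
A_hat = A + np.eye(n_nodes)
H = (A_hat / A_hat.sum(axis=1, keepdims=True)) @ X

# Share of each node's own features surviving in its updated representation.
own_share = 1.0 / (A.sum(axis=1) + 1.0)
print(own_share)  # hub node 0 keeps only 0.2 of its own signal: [0.2 0.5 0.5 0.5 0.5]
```

This only mirrors the inter-node (node-level) factor described in the abstract; the intra-node, attribute-level factor the paper also formulates is not modeled here.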

Country of Origin
🇺🇸 🇰🇷 United States, Korea, Republic of

Repos / Data Links
https://github.com/LeeJunHyun/NATR

Page Count
22 pages

Category
Computer Science: Machine Learning (CS)