Short-Range Oversquashing

Published: November 25, 2025 | arXiv ID: 2511.20406v1

By: Yaaqov Mishayev, Yonatan Sverdlov, Tal Amir, and more

Potential Business Impact:

Clarifies when graph neural networks fail to combine information across a graph, guiding the choice between message-passing networks and transformers in graph-based applications.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Message Passing Neural Networks (MPNNs) are widely used for learning on graphs, but their ability to process long-range information is limited by the phenomenon of oversquashing. This limitation has led some researchers to advocate Graph Transformers as a better alternative, whereas others suggest that it can be mitigated within the MPNN framework, using virtual nodes or other rewiring techniques. In this work, we demonstrate that oversquashing is not limited to long-range tasks, but can also arise in short-range problems. This observation allows us to disentangle two distinct mechanisms underlying oversquashing: (1) the bottleneck phenomenon, which can arise even in low-range settings, and (2) the vanishing gradient phenomenon, which is closely associated with long-range tasks. We further show that the short-range bottleneck effect is not captured by existing explanations for oversquashing, and that adding virtual nodes does not resolve it. In contrast, transformers do succeed in such tasks, positioning them as the more compelling solution to oversquashing, compared to specialized MPNNs.
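The bottleneck mechanism described above can be illustrated with a minimal NumPy sketch (illustrative only, not the paper's code): in one aggregation step of an MPNN, a node compresses all of its neighbours' features into a single fixed-size vector, so the amount of information per neighbour shrinks as the neighbourhood grows. Adding a virtual node gives every node a one-hop path to every other, but the virtual node itself still aggregates the whole graph into one such vector, which is why it does not remove this bottleneck.

```python
import numpy as np

def mean_aggregate(neighbour_feats):
    # One MPNN-style aggregation step: an arbitrary number of
    # neighbour feature vectors is squashed into a single vector
    # of fixed dimension, independent of the neighbourhood size.
    return neighbour_feats.mean(axis=0)

# Star graph: the centre node must summarise n leaf features
# in a single d-dimensional vector, however large n becomes.
rng = np.random.default_rng(0)
d = 8
for n in (4, 4096):
    leaves = rng.standard_normal((n, d))  # n distinct leaf features
    centre = mean_aggregate(leaves)       # always shape (d,)
    print(n, centre.shape)
```

A virtual node connected to all real nodes plays exactly the role of the centre in this star graph, so its summary vector faces the same fixed-capacity constraint; attention in a transformer, by contrast, lets each target node weight each source individually rather than pooling everything into one intermediate vector.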

Country of Origin
🇮🇱 Israel

Repos / Data Links

Page Count
14 pages

Category
Computer Science:
Machine Learning (CS)