Message Passing on the Edge: Towards Scalable and Expressive GNNs
By: Pablo Barceló, Fabian Jogl, Alexander Kozachinskiy, and more
Potential Business Impact:
Makes computers understand complex data patterns faster.
We propose EB-1WL, an edge-based color-refinement test, and a corresponding GNN architecture, EB-GNN. Our architecture is inspired by a classic triangle counting algorithm by Chiba and Nishizeki, and explicitly uses triangles during message passing. We achieve the following results: (1) EB-1WL is significantly more expressive than 1-WL. Further, we provide a complete logical characterization of EB-1WL based on first-order logic, and matching distinguishability results based on homomorphism counting. (2) In an important distinction from previous proposals for more expressive GNN architectures, EB-1WL and EB-GNN require near-linear time and memory on practical graph learning tasks. (3) Empirically, we show that EB-GNN is a highly efficient general-purpose architecture: it substantially outperforms simple MPNNs, and remains competitive with task-specialized GNNs while being significantly more computationally efficient.
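The abstract credits the architecture's efficiency to the classic Chiba–Nishizeki triangle listing algorithm, which enumerates all triangles in O(m · arboricity) time by processing vertices in non-increasing degree order and marking neighbors. As a rough illustration of that idea (a minimal sketch, not the paper's actual EB-GNN implementation; the function name and edge-list interface are our own), one version looks like this:

```python
from collections import defaultdict

def chiba_nishizeki_triangles(edges):
    """List each triangle of an undirected graph exactly once.

    Sketch of the Chiba-Nishizeki strategy: visit vertices in
    non-increasing degree order, mark the current vertex's neighbors,
    scan each neighbor's adjacency list for marked vertices, then
    logically delete the current vertex so no triangle repeats.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    removed = set()   # vertices already processed ("deleted")
    triangles = []
    for v in order:
        marked = adj[v] - removed          # mark v's remaining neighbors
        for u in marked:
            for w in adj[u] - removed:
                # w marked means (v, u, w) close a triangle;
                # u < w keeps each pair from being reported twice.
                if w in marked and u < w:
                    triangles.append((v, u, w))
        removed.add(v)                     # delete v: triangle listed once
    return triangles

# Example: a triangle {0, 1, 2} with a pendant edge (2, 3).
print(chiba_nishizeki_triangles([(0, 1), (1, 2), (0, 2), (2, 3)]))
```

The degree ordering is what gives the arboricity-bounded running time; EB-GNN, per the abstract, exploits the same triangle structure during message passing to stay near-linear in practice.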
Similar Papers
Weisfeiler-Lehman meets Events: An Expressivity Analysis for Continuous-Time Dynamic Graph Neural Networks
Machine Learning (CS)
Lets computers understand changing, messy real-world networks.
What Expressivity Theory Misses: Message Passing Complexity for GNNs
Machine Learning (CS)
Measures how well computer networks learn from data.
Repetition Makes Perfect: Recurrent Graph Neural Networks Match Message Passing Limit
Machine Learning (CS)
Lets computers understand any connected graph problem.