ADMP-GNN: Adaptive Depth Message Passing GNN
By: Yassine Abbahaddou, Fragkiskos D. Malliaros, Johannes F. Lutzeyer, et al.
Potential Business Impact:
Lets each node in a network decide how many steps of reasoning it needs.
Graph Neural Networks (GNNs) have proven to be highly effective in various graph learning tasks. A key characteristic of GNNs is their use of a fixed number of message-passing steps for all nodes in the graph, regardless of each node's diverse computational needs and characteristics. Through empirical analysis of real-world data, we demonstrate that the optimal number of message-passing layers varies for nodes with different characteristics. This finding is further supported by experiments conducted on synthetic datasets. To address this, we propose Adaptive Depth Message Passing GNN (ADMP-GNN), a novel framework that dynamically adjusts the number of message-passing layers for each node, resulting in improved performance. This approach applies to any model that follows the message-passing scheme. We evaluate ADMP-GNN on the node classification task and observe performance improvements over baseline GNN models.
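The core idea of per-node adaptive depth can be sketched with plain NumPy: run several message-passing steps, keep every intermediate representation, and let each node read out the layer that matches its assigned depth. This is an illustrative toy, not the paper's actual ADMP-GNN: mean aggregation stands in for a learned GNN layer, and the per-node depths are given as input here, whereas in the paper they would be selected adaptively.

```python
import numpy as np

def message_pass(H, A_norm):
    # One mean-aggregation message-passing step: each node averages its
    # neighbours' features (A_norm is the row-normalised adjacency matrix).
    return A_norm @ H

def adaptive_depth_embeddings(X, A, depths, num_layers):
    # Run `num_layers` message-passing steps, keeping every intermediate
    # representation, then let node i read out layer depths[i].
    deg = A.sum(axis=1, keepdims=True)
    A_norm = A / np.maximum(deg, 1.0)        # row-normalised adjacency
    layers = [X]
    H = X
    for _ in range(num_layers):
        H = message_pass(H, A_norm)
        layers.append(H)
    stacked = np.stack(layers)               # (num_layers+1, n_nodes, dim)
    idx = np.arange(X.shape[0])
    return stacked[depths, idx]              # per-node depth selection

# Toy graph: a path 0-1-2 with scalar node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1.], [0.], [0.]])
depths = np.array([0, 1, 2])                 # hypothetical per-node depths
out = adaptive_depth_embeddings(X, A, depths, num_layers=2)
# Node 0 keeps its raw feature, node 1 its 1-hop view, node 2 its 2-hop view.
```

The same principle applies with any message-passing layer in place of the mean aggregation; the point is that intermediate layer outputs are retained so that nodes needing different receptive-field sizes can each use the depth that suits them.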
Similar Papers
Demystifying MPNNs: Message Passing as Merely Efficient Matrix Multiplication
Machine Learning (CS)
Explains how computer networks learn from connections.
Learning to accelerate distributed ADMM using graph neural networks
Machine Learning (CS)
Learns faster ways for computers to solve big problems.
An Active Diffusion Neural Network for Graphs
Machine Learning (CS)
Helps computers understand complex networks better.