Flow-Attentional Graph Neural Networks
By: Pascal Plettenberg, Dominik Köhler, Bernhard Sick and more
Potential Business Impact:
Helps computers understand power grids and circuits better.
Graph Neural Networks (GNNs) have become essential for learning from graph-structured data. However, existing GNNs do not consider the conservation law inherent in graphs associated with a flow of physical resources, such as electrical current in power grids or traffic in transportation networks, which can lead to reduced model performance. To address this, we propose flow attention, which adapts existing graph attention mechanisms to satisfy Kirchhoff's first law. Furthermore, we discuss how this modification influences the expressivity and identify sets of non-isomorphic graphs that can be discriminated by flow attention but not by standard attention. Through extensive experiments on two flow graph datasets (electronic circuits and power grids), we demonstrate that flow attention enhances the performance of attention-based GNNs on both graph-level classification and regression tasks.
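The abstract only states the conservation constraint, so for intuition here is a minimal, hypothetical sketch of one way a GAT-style attention layer could be made "flow-aware": normalize the attention coefficients over each node's outgoing edges (rather than over the incoming edges, as in standard GAT), so that every node distributes a single unit of flow among its successors. The class name, layer structure, and normalization choice below are illustrative assumptions, not the paper's actual formulation.

```python
# Hedged sketch of an outflow-normalized attention layer (assumption, not the
# paper's exact method). Plain PyTorch with a dense adjacency matrix.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FlowAttentionLayer(nn.Module):
    """GAT-style layer whose attention weights are normalized over each node's
    outgoing edges, so each node distributes one unit of 'flow' among its
    successors (a Kirchhoff-like conservation bias)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.att_src = nn.Parameter(torch.randn(out_dim))
        self.att_dst = nn.Parameter(torch.randn(out_dim))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: [N, in_dim] node features; adj: [N, N], adj[i, j] = 1 for a directed edge i -> j
        h = self.lin(x)                                           # [N, out_dim]
        scores = (h @ self.att_src).unsqueeze(1) \
               + (h @ self.att_dst).unsqueeze(0)                  # scores[i, j] scores edge i -> j
        scores = F.leaky_relu(scores, negative_slope=0.2)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        # Key difference from standard GAT: softmax over dim=1 (outgoing edges of node i)
        # instead of dim=0 (incoming edges of node j).
        alpha = torch.softmax(scores, dim=1)
        alpha = torch.nan_to_num(alpha)                           # nodes with no outgoing edges
        return alpha.transpose(0, 1) @ h                          # aggregate messages at each target node


# Toy usage: a 4-node directed graph with 8-dimensional node features.
x = torch.randn(4, 8)
adj = torch.tensor([[0., 1., 1., 0.],
                    [0., 0., 1., 0.],
                    [0., 0., 0., 1.],
                    [0., 0., 0., 0.]])
out = FlowAttentionLayer(8, 16)(x, adj)   # -> shape [4, 16]
```

In standard attention the softmax runs over a node's incoming edges, so nothing constrains how much "mass" leaves a node; switching the normalization axis is one simple way to encode a conservation-style inductive bias, which is the general idea the paper formalizes.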
Similar Papers
Topology-aware Neural Flux Prediction Guided by Physics
Machine Learning (CS)
Helps computers understand traffic flow better.
When Does Global Attention Help? A Unified Empirical Study on Atomistic Graph Learning
Machine Learning (CS)
Helps computers predict material properties faster.
Generalizable Graph Neural Networks for Robust Power Grid Topology Control
Machine Learning (CS)
Makes power grids smarter and more reliable.