GSAT: Graph Structure Attention Networks
By: Farshad Noravesh, Reza Haffari, Layki Soon, and more
Potential Business Impact:
Helps computers understand complex data patterns better.
Graph Neural Networks (GNNs) have emerged as a powerful tool for processing data represented as graphs, achieving remarkable success across a wide range of applications. However, to further improve performance on graph classification benchmarks, the structural representation of each node, which encodes rich local topological information about its neighbourhood, is an important type of feature that is often overlooked in modeling. Neglecting this structural information forces models to stack a high number of layers to connect messages from distant nodes, which in turn produces other problems such as oversmoothing. In the present paper, we leverage structural information modeled by anonymous random walks (ARWs) and introduce the graph structure attention network (GSAT), a generalization of the graph attention network (GAT) that integrates the original attributes with the structural representation, enforcing the model to automatically find patterns for attending to different edges in a node's neighbourhood and thereby enrich the graph representation. Our experiments show GSAT slightly improves on the state of the art (SOTA) on some graph classification benchmarks.
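The structural features the abstract refers to come from anonymous random walks: a walk is relabeled by the order in which nodes first appear, so the resulting pattern captures local topology independent of node identities. A minimal sketch of this idea (assumed illustration only; the function names, walk length, and toy graph here are not from the paper):

```python
import random
from collections import defaultdict

def anonymize(walk):
    # Map each node to the index of its first occurrence in the walk,
    # yielding the anonymous random walk (ARW) pattern.
    first_seen = {}
    pattern = []
    for node in walk:
        if node not in first_seen:
            first_seen[node] = len(first_seen)
        pattern.append(first_seen[node])
    return tuple(pattern)

def random_walk(adj, start, length, rng):
    # Uniform random walk of `length` steps starting at `start`.
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

def arw_histogram(adj, node, walk_len=4, n_walks=200, seed=0):
    # Empirical distribution over ARW patterns rooted at `node`;
    # a vector like this can serve as the node's structural representation
    # that is combined with its attribute features.
    rng = random.Random(seed)
    counts = defaultdict(int)
    for _ in range(n_walks):
        counts[anonymize(random_walk(adj, node, walk_len, rng))] += 1
    return {p: c / n_walks for p, c in counts.items()}

# Toy graph: a triangle attached to a short path.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}
print(anonymize([5, 7, 5, 9]))  # -> (0, 1, 0, 2)
print(arw_histogram(adj, 0))
```

Two nodes with isomorphic neighbourhoods produce similar ARW histograms even if their attributes differ, which is what lets an attention layer use such a vector, alongside the raw attributes, to decide how strongly to attend to each neighbour.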
Similar Papers
SA-GAT-SR: Self-Adaptable Graph Attention Networks with Symbolic Regression for high-fidelity material property prediction
Computational Physics
Finds new materials by understanding how they work.
Beyond Attention: Learning Spatio-Temporal Dynamics with Emergent Interpretable Topologies
Machine Learning (CS)
Predicts future events better and faster.
Feature-based Graph Attention Networks Improve Online Continual Learning
CV and Pattern Recognition
Teaches computers to learn new things without forgetting.