LightTopoGAT: Enhancing Graph Attention Networks with Topological Features for Efficient Graph Classification
By: Ankit Sharma, Sayan Roy Gupta
Graph Neural Networks (GNNs) have demonstrated significant success in graph classification, yet they often require substantial computational resources and struggle to capture global graph properties. We introduce LightTopoGAT, a lightweight graph attention network that augments node features with two topological descriptors, node degree and local clustering coefficient, to improve graph representation learning. The proposed approach maintains parameter efficiency through streamlined attention mechanisms while integrating structural information that is typically overlooked by local message-passing schemes. In comprehensive experiments on three benchmark datasets (MUTAG, ENZYMES, and PROTEINS), LightTopoGAT outperforms established baselines including GCN, GraphSAGE, and standard GAT, improving accuracy by 6.6% on MUTAG and 2.2% on PROTEINS. Ablation studies confirm that these gains arise directly from the inclusion of topological features, demonstrating a simple yet effective strategy for enhancing GNN performance without increasing architectural complexity.
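The abstract's central idea, appending each node's degree and local clustering coefficient to its raw feature vector before the attention layers, can be sketched as below. This is a minimal illustration using only NumPy on a dense adjacency matrix; the function and variable names are illustrative and not taken from the authors' implementation.

```python
import numpy as np

def topo_augment(A: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Append node degree and local clustering coefficient to features.

    A: (N, N) symmetric 0/1 adjacency matrix without self-loops.
    X: (N, F) node feature matrix.
    Returns an (N, F + 2) augmented feature matrix.
    """
    deg = A.sum(axis=1)                      # node degrees
    tri = np.diag(A @ A @ A) / 2.0           # triangles incident to each node
    pairs = deg * (deg - 1) / 2.0            # possible neighbor pairs
    # Local clustering coefficient; nodes with degree < 2 get 0.
    clust = np.divide(tri, pairs,
                      out=np.zeros_like(tri, dtype=float),
                      where=pairs > 0)
    return np.hstack([X, deg[:, None], clust[:, None]])

# Example: a triangle graph with 1-dimensional input features.
A = np.ones((3, 3)) - np.eye(3)
X = np.zeros((3, 1))
X_aug = topo_augment(A, X)
print(X_aug.shape)  # (3, 3): every node gains degree 2 and clustering 1.0
```

The augmented matrix would then be fed to the attention layers in place of the original features, which is what lets a purely local message-passing scheme see these structural signals without any change to the architecture itself.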