Graph Attention for Heterogeneous Graphs with Positional Encoding

Published: April 3, 2025 | arXiv ID: 2504.02938v1

By: Nikhil Shivakumar Nayak

Potential Business Impact:

Makes computers better at understanding connected information.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Graph Neural Networks (GNNs) have emerged as the de facto standard for modeling graph data, with attention mechanisms and transformers significantly enhancing their performance on graph-based tasks. Despite these advancements, heterogeneous graphs remain challenging for GNNs, which generally underperform on them compared to their homogeneous counterparts. This work benchmarks various GNN architectures to identify the most effective methods for heterogeneous graphs, with a particular focus on node classification and link prediction. Our findings reveal that graph attention networks excel at these tasks. As a main contribution, we explore enhancements to these attention networks by integrating positional encodings into node embeddings. This involves utilizing the full Laplacian spectrum to accurately capture both the relative and absolute positions of each node within the graph, further enhancing performance on downstream tasks such as node classification and link prediction.
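The Laplacian positional encoding mentioned in the abstract can be illustrated with a minimal sketch: compute the symmetric normalized graph Laplacian and use its eigenvectors as per-node position features. This is a generic illustration, not the paper's implementation; in common practice only the first `k` non-trivial eigenvectors are kept, whereas the abstract states the full spectrum is used (`k = n - 1` here). The function name and the example graph are assumptions for demonstration.

```python
import numpy as np

def laplacian_positional_encoding(adj, k):
    """Return the first k non-trivial eigenvectors of the symmetric
    normalized Laplacian as a (n_nodes, k) positional-encoding matrix."""
    deg = adj.sum(axis=1)
    # D^{-1/2}, guarding against isolated (zero-degree) nodes
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(np.maximum(deg, 1e-12)), 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # eigh returns eigenvalues in ascending order for symmetric matrices
    eigvals, eigvecs = np.linalg.eigh(lap)
    # Skip the trivial constant eigenvector (eigenvalue ~ 0)
    return eigvecs[:, 1:k + 1]

# Example: 4-node path graph 0-1-2-3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

pe = laplacian_positional_encoding(adj, k=2)
print(pe.shape)  # (4, 2): a 2-dimensional position vector per node
```

These eigenvector features would typically be concatenated with (or added to) the initial node embeddings before the attention layers; note that eigenvector signs are arbitrary, so sign-flip augmentation is a common accompanying trick.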

Country of Origin
🇺🇸 United States

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)