Topologic Attention Networks: Attending to Direct and Indirect Neighbors through Gaussian Belief Propagation
By: Marshall Rosenhoover, Huaming Zhang
Potential Business Impact:
Lets computers understand complex connections faster.
Graph Neural Networks rely on local message passing, which limits their ability to model long-range dependencies in graphs. Existing approaches extend this range through continuous-time dynamics or dense self-attention, but both suffer from high computational cost and limited scalability. We propose Topologic Attention Networks, a new framework that applies topologic attention, a probabilistic mechanism built on Gaussian belief propagation that learns how information should flow through both direct and indirect connections in a graph. Unlike conventional attention, which depends on explicit pairwise interactions, topologic attention emerges from the learned information propagation of the graph, enabling unified reasoning over local and global relationships. Our method achieves state-of-the-art performance against all evaluated baselines. Our implementation is available at https://github.com/Marshall-Rosenhoover/Topologic-Attention-Networks.
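For readers unfamiliar with Gaussian belief propagation (GaBP), the mechanism named in the title, below is a minimal, generic sketch of synchronous GaBP solving a linear system A x = b over a graph. It is not the authors' topologic attention implementation (see the repository above for that); the function name gabp and its parameters are illustrative only. The sketch shows the relevant property: iterated local messages let each node accumulate evidence from indirect neighbors, not just adjacent ones.

    # Generic synchronous Gaussian belief propagation (GaBP) sketch.
    # Assumes A is symmetric and diagonally dominant so the iteration converges.
    # Illustrative only; NOT the Topologic Attention Networks implementation.
    import numpy as np

    def gabp(A, b, num_iters=50, tol=1e-8):
        """Approximate marginal means mu (the solution of A x = b) and
        marginal precisions P via Gaussian belief propagation."""
        n = A.shape[0]
        # alpha[i, j]: precision of message i -> j; beta[i, j]: its potential.
        alpha = np.zeros((n, n))
        beta = np.zeros((n, n))
        nbrs = [np.flatnonzero((A[i] != 0) & (np.arange(n) != i)) for i in range(n)]
        for _ in range(num_iters):
            new_alpha = np.zeros_like(alpha)
            new_beta = np.zeros_like(beta)
            for i in range(n):
                for j in nbrs[i]:
                    # Fuse all incoming messages except the one from j.
                    p = A[i, i] + sum(alpha[k, i] for k in nbrs[i] if k != j)
                    h = b[i] + sum(beta[k, i] for k in nbrs[i] if k != j)
                    new_alpha[i, j] = -A[i, j] ** 2 / p
                    new_beta[i, j] = -A[i, j] * h / p
            converged = np.max(np.abs(new_alpha - alpha)) < tol
            alpha, beta = new_alpha, new_beta
            if converged:
                break
        # Node marginals fuse the local potential with all incoming messages.
        P = np.array([A[i, i] + sum(alpha[k, i] for k in nbrs[i]) for i in range(n)])
        mu = np.array([(b[i] + sum(beta[k, i] for k in nbrs[i])) / P[i] for i in range(n)])
        return mu, P

    # Example on a small path graph (a tree, so the means are exact).
    A = np.array([[ 4.0, -1.0,  0.0],
                  [-1.0,  4.0, -1.0],
                  [ 0.0, -1.0,  4.0]])
    b = np.array([1.0, 2.0, 3.0])
    mu, P = gabp(A, b)
    print(mu, np.linalg.solve(A, b))  # GaBP means match the direct solve

On tree-structured graphs the recovered means are exact, and on loopy but walk-summable graphs GaBP still converges to the exact means, which is what makes belief propagation attractive as a backbone for reasoning over both direct and indirect neighbors.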
Similar Papers
LightTopoGAT: Enhancing Graph Attention Networks with Topological Features for Efficient Graph Classification
Machine Learning (CS)
Makes computer "brains" understand graphs better.
Attention Beyond Neighborhoods: Reviving Transformer for Graph Clustering
Machine Learning (CS)
Helps computers group similar things by looking at connections.
Cross-View Topology-Aware Graph Representation Learning
Machine Learning (CS)
Helps computers understand complex data patterns better.