Cross-View Topology-Aware Graph Representation Learning
By: Ahmet Sami Korkmaz, Selim Coskunuzer, Md Joshem Uddin
Potential Business Impact:
Helps computers understand complex data patterns better.
Graph classification has gained significant attention due to its applications in chemistry, social networks, and bioinformatics. While Graph Neural Networks (GNNs) effectively capture local structural patterns, they often overlook global topological features that are critical for robust representation learning. In this work, we propose GraphTCL, a dual-view contrastive learning framework that integrates structural embeddings from GNNs with topological embeddings derived from persistent homology. By aligning these complementary views through a cross-view contrastive loss, our method enhances representation quality and improves classification performance. Extensive experiments on benchmark datasets, including TU and OGB molecular graphs, demonstrate that GraphTCL consistently outperforms state-of-the-art baselines. This study highlights the importance of topology-aware contrastive learning for advancing graph representation methods.
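The core of the cross-view alignment described above can be sketched as a symmetric InfoNCE-style contrastive loss between the two embedding views. This is only a minimal illustration under assumed conventions, not the paper's actual implementation: `z_struct` and `z_topo` stand for batches of GNN-derived and persistent-homology-derived graph embeddings, with row `i` of each matrix corresponding to the same graph, and the function name and `temperature` default are hypothetical.

```python
import numpy as np

def cross_view_contrastive_loss(z_struct, z_topo, temperature=0.5):
    """Symmetric InfoNCE-style loss aligning two views of the same graphs.

    z_struct: (batch, dim) structural embeddings (e.g. from a GNN).
    z_topo:   (batch, dim) topological embeddings (e.g. from persistent
              homology vectorizations). Row i of each view is the same graph.
    """
    # L2-normalize rows so the dot products below are cosine similarities.
    z1 = z_struct / np.linalg.norm(z_struct, axis=1, keepdims=True)
    z2 = z_topo / np.linalg.norm(z_topo, axis=1, keepdims=True)

    # Pairwise similarity logits; matching graphs sit on the diagonal.
    logits = (z1 @ z2.T) / temperature

    def nll_of_diagonal(m):
        # Numerically stable log-softmax over each row, then take the
        # negative log-probability of the positive (diagonal) entry.
        m = m - m.max(axis=1, keepdims=True)
        log_probs = m - np.log(np.exp(m).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_probs))

    # Symmetric: each view must identify its counterpart in the other view.
    return (nll_of_diagonal(logits) + nll_of_diagonal(logits.T)) / 2
```

Intuitively, the loss is low when each graph's structural embedding is closer to its own topological embedding than to those of other graphs in the batch, which is what "aligning complementary views" amounts to in practice.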
Similar Papers
Topological Feature Compression for Molecular Graph Neural Networks
Machine Learning (CS)
Finds better ways to build new materials.
Learning the Structure of Connection Graphs
Machine Learning (CS)
Finds hidden patterns in connected data.
GCL-GCN: Graphormer and Contrastive Learning Enhanced Attributed Graph Clustering Network
Machine Learning (CS)
Groups similar things together in complex data.