Self-Reinforced Graph Contrastive Learning
By: Chou-Ying Hsieh, Chun-Fu Jang, Cheng-En Hsieh and more
Potential Business Impact:
Improves computer understanding of linked information.
Graphs serve as versatile data structures in numerous real-world domains, including social networks, molecular biology, and knowledge graphs, by capturing intricate relational information among entities. Among graph-based learning techniques, Graph Contrastive Learning (GCL) has gained significant attention for its ability to derive robust, self-supervised graph representations by contrasting positive and negative sample pairs. A critical challenge, however, lies in ensuring that positive pairs are of high quality, so that the intrinsic semantic and structural properties of the original graph are preserved rather than distorted. To address this issue, we propose SRGCL (Self-Reinforced Graph Contrastive Learning), a novel framework that leverages the model's own encoder to dynamically evaluate and select high-quality positive pairs. We design a unified positive pair generator that employs multiple augmentation strategies, together with a selector guided by the manifold hypothesis to preserve the underlying geometry of the latent space. By adopting a probabilistic mechanism for selecting positive pairs, SRGCL iteratively refines its assessment of pair quality as the encoder's representational power improves. Extensive experiments on diverse graph-level classification tasks demonstrate that SRGCL, used as a plug-in module, consistently outperforms state-of-the-art GCL methods, underscoring its adaptability and efficacy across domains.
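The abstract describes the self-reinforced selection loop only at a high level. Below is a minimal sketch, in PyTorch, of how such a mechanism could be wired up: generate several candidate augmented views of a graph, embed each with the current encoder, score candidates by their similarity to the anchor embedding (a simple proxy for the manifold-hypothesis criterion), and sample one view probabilistically. All names and design choices here (ToyGraphEncoder, drop_edges, mask_features, select_positive_view, the cosine-similarity score, the softmax temperature) are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (not the authors' code) of a self-reinforced positive-pair
# selector: candidates whose embeddings stay close to the anchor under the
# *current* encoder are more likely to be kept as positive views.

import torch
import torch.nn.functional as F

class ToyGraphEncoder(torch.nn.Module):
    """One normalized propagation step (A_hat @ X @ W) followed by mean pooling."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj):
        adj_hat = adj + torch.eye(adj.size(0))            # add self-loops
        deg = adj_hat.sum(dim=1, keepdim=True).clamp(min=1)
        h = F.relu(self.lin((adj_hat / deg) @ x))         # row-normalized propagation
        return h.mean(dim=0)                              # graph-level embedding

def drop_edges(x, adj, p=0.2):
    """Randomly remove a fraction p of edges, keeping the graph symmetric."""
    mask = (torch.rand_like(adj) > p).float()
    mask = torch.triu(mask, 1)
    return x, adj * (mask + mask.T)

def mask_features(x, adj, p=0.2):
    """Randomly zero out a fraction p of feature columns."""
    col_mask = (torch.rand(x.size(1)) > p).float()
    return x * col_mask, adj

def select_positive_view(encoder, x, adj, augmentations, n_candidates=8, temp=0.1):
    """Generate candidate views with mixed augmentations, then sample one
    with probability proportional to its (softmaxed) similarity to the anchor."""
    with torch.no_grad():
        anchor = encoder(x, adj)
        views, scores = [], []
        for _ in range(n_candidates):
            aug = augmentations[torch.randint(len(augmentations), (1,)).item()]
            xv, av = aug(x, adj)
            views.append((xv, av))
            scores.append(F.cosine_similarity(anchor, encoder(xv, av), dim=0))
        probs = F.softmax(torch.stack(scores) / temp, dim=0)
        idx = torch.multinomial(probs, 1).item()
    return views[idx]

# Usage: one anchor graph with 5 nodes and 4-dimensional features.
torch.manual_seed(0)
x = torch.randn(5, 4)
adj = (torch.rand(5, 5) > 0.5).float()
adj = torch.triu(adj, 1); adj = adj + adj.T
encoder = ToyGraphEncoder(4, 16)
xv, av = select_positive_view(encoder, x, adj, [drop_edges, mask_features])
print("selected positive view:", xv.shape, av.shape)
```

Because the selector scores candidates with the encoder being trained, its notion of pair quality sharpens as the representations improve, which is the self-reinforcing loop the abstract refers to.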
Similar Papers
A Generative Graph Contrastive Learning Model with Global Signal
Machine Learning (CS)
Makes computer learning from networks smarter.
Model-Driven Graph Contrastive Learning
Machine Learning (CS)
Teaches computers to understand data better.
Supervised Graph Contrastive Learning for Gene Regulatory Networks
Machine Learning (CS)
Helps predict cancer survival from gene networks.