Negative Metric Learning for Graphs

Published: May 15, 2025 | arXiv ID: 2505.10307v1

By: Yiyang Zhao, Chengpei Wu, Lilin Zhang, and more

Potential Business Impact:

Helps computers learn from graph-structured data (such as social or citation networks) without labels, which can improve downstream predictions.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Graph contrastive learning (GCL) often suffers from false negatives, which degrade performance on downstream tasks. Existing methods that address the false-negative issue usually rely on human prior knowledge, still leaving GCL with suboptimal results. In this paper, we propose a novel Negative Metric Learning (NML) enhanced GCL (NML-GCL). NML-GCL employs a learnable Negative Metric Network (NMN) to build a negative metric space in which false negatives can be better distinguished from true negatives based on their distance to the anchor node. To overcome the lack of explicit supervision signals for NML, we propose a joint training scheme with a bi-level optimization objective, which implicitly uses self-supervision signals to iteratively optimize the encoder and the negative metric network. Solid theoretical analysis and extensive experiments on widely used benchmarks verify the superiority of the proposed method.
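To make the idea concrete, here is a minimal sketch (not the authors' code) of how a learnable negative metric network could reweight negatives in an InfoNCE-style graph contrastive loss: the network scores each anchor-candidate pair, and suspected false negatives receive smaller weights in the denominator. All class names, shapes, and hyperparameters below are assumptions for illustration.

```python
# Sketch only: a learnable "negative metric network" down-weights suspected
# false negatives inside an InfoNCE-style contrastive loss between two views.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NegativeMetricNetwork(nn.Module):
    """Maps (anchor, candidate) pairs to weights in (0, 1); higher means
    'more likely a true negative'."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, anchor: torch.Tensor, candidates: torch.Tensor) -> torch.Tensor:
        # anchor: [N, d], candidates: [N, M, d] -> weights: [N, M]
        a = anchor.unsqueeze(1).expand_as(candidates)
        return torch.sigmoid(self.mlp(torch.cat([a, candidates], dim=-1))).squeeze(-1)


def weighted_infonce(z1: torch.Tensor, z2: torch.Tensor,
                     nmn: NegativeMetricNetwork, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE between two node-embedding views z1, z2 of shape [N, d];
    each negative term is scaled by the metric network's weight."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / tau                        # [N, N] cosine similarities
    pos = sim.diag()                               # positive pairs (same node, two views)
    neg_mask = ~torch.eye(len(z1), dtype=torch.bool, device=z1.device)
    # Weight each candidate negative by how "truly negative" the NMN thinks it is.
    w = nmn(z1, z2.unsqueeze(0).expand(len(z1), -1, -1))   # [N, N]
    exp_neg = (w * sim.exp() * neg_mask).sum(dim=1)
    return -(pos - torch.log(pos.exp() + exp_neg + 1e-12)).mean()
```

In the paper, the encoder and the negative metric network are trained jointly under a bi-level objective using only self-supervision signals; the sketch above shows only how the learned weights would enter the contrastive denominator, not that joint optimization loop.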

Country of Origin
🇨🇳 China

Page Count
14 pages

Category
Computer Science:
Machine Learning (CS)