Self-Supervised Learning on Molecular Graphs: A Systematic Investigation of Masking Design

Published: December 8, 2025 | arXiv ID: 2512.07064v1

By: Jiannan Yang, Veronika Thost, Tengfei Ma

BigTech Affiliations: IBM

Potential Business Impact:

Helps computers model molecules more accurately, which can support the discovery of new medicines.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Self-supervised learning (SSL) plays a central role in molecular representation learning. Yet many recent innovations in masking-based pretraining are introduced as heuristics and lack principled evaluation, obscuring which design choices are genuinely effective. This work casts the entire pretrain-finetune workflow into a unified probabilistic framework, enabling transparent comparison and a deeper understanding of masking strategies. Building on this formalism, the authors conduct a rigorously controlled study of three core design dimensions: masking distribution, prediction target, and encoder architecture. They further employ information-theoretic measures to assess the informativeness of pretraining signals and connect them to downstream benchmark performance. The findings reveal a surprising insight: sophisticated masking distributions offer no consistent benefit over uniform sampling for common node-level prediction tasks. Instead, the choice of prediction target and its synergy with the encoder architecture are far more critical. Specifically, shifting to semantically richer targets yields substantial downstream improvements, particularly when paired with expressive Graph Transformer encoders. These insights offer practical guidance for developing more effective SSL methods for molecular graphs.
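To make the "uniform sampling" baseline concrete, here is a minimal sketch of masking-based pretraining data preparation for a molecular graph: nodes (atoms) are masked uniformly at random, and the original atom types at the masked positions become the prediction targets. This is a generic illustration, not the paper's implementation; the function name, `MASK_TOKEN`, and the 15% mask rate are assumptions for the example.

```python
import random

MASK_TOKEN = "[MASK]"  # hypothetical placeholder symbol for a masked atom

def uniform_node_mask(atom_types, mask_rate=0.15, rng=None):
    """Uniformly sample nodes of one molecular graph to mask.

    atom_types: list of atom symbols, one per node (e.g. ["C", "N", "O"]).
    Returns (corrupted, targets): the corrupted input sequence, and a dict
    mapping each masked node index to its original atom symbol, which the
    encoder is trained to predict.
    """
    rng = rng or random.Random()
    n = len(atom_types)
    k = max(1, int(round(mask_rate * n)))  # always mask at least one node
    masked = set(rng.sample(range(n), k))  # uniform choice without replacement
    corrupted = [MASK_TOKEN if i in masked else a
                 for i, a in enumerate(atom_types)]
    targets = {i: atom_types[i] for i in masked}
    return corrupted, targets
```

The paper's point is that replacing this uniform sampler with more elaborate masking distributions brought no consistent gain; what mattered more was what is predicted at the masked positions (e.g. semantically richer targets than raw atom types) and which encoder consumes the corrupted graph.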

Country of Origin
🇺🇸 United States

Page Count
32 pages

Category
Computer Science:
Machine Learning (CS)