Self-Supervised Learning on Molecular Graphs: A Systematic Investigation of Masking Design
By: Jiannan Yang, Veronika Thost, Tengfei Ma
Potential Business Impact:
Makes computers understand molecules better for new medicines.
Self-supervised learning (SSL) plays a central role in molecular representation learning. Yet many recent innovations in masking-based pretraining are introduced as heuristics and lack principled evaluation, obscuring which design choices are genuinely effective. This work casts the entire pretrain-finetune workflow into a unified probabilistic framework, enabling a transparent comparison and deeper understanding of masking strategies. Building on this formalism, we conduct a systematic study of three core design dimensions (masking distribution, prediction target, and encoder architecture) under rigorously controlled settings. We further employ information-theoretic measures to assess the informativeness of pretraining signals and connect them to empirically benchmarked downstream performance. Our findings reveal a surprising insight: sophisticated masking distributions offer no consistent benefit over uniform sampling for common node-level prediction tasks. Instead, the choice of prediction target and its synergy with the encoder architecture are far more critical. Specifically, shifting to semantically richer targets yields substantial downstream improvements, particularly when paired with expressive Graph Transformer encoders. These insights offer practical guidance for developing more effective SSL methods for molecular graphs.
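To make the masking-based pretraining setup concrete, the sketch below illustrates the simplest baseline the paper discusses: a uniform masking distribution with an atom-type prediction target. It is a minimal illustration in plain PyTorch, not the authors' implementation; the class and function names (ToyGNNEncoder, masked_atom_pretrain_step), the vocabulary size, and the toy molecule are all assumptions for exposition.

```python
# Minimal sketch: uniform node masking + atom-type prediction target.
# Hypothetical names and hyperparameters; not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_ATOM_TYPES = 119          # assumed vocabulary size (e.g., atomic numbers)
MASK_TOKEN = NUM_ATOM_TYPES   # extra index reserved for the [MASK] token


class ToyGNNEncoder(nn.Module):
    """Two rounds of mean-neighbor aggregation over a dense adjacency matrix."""

    def __init__(self, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(NUM_ATOM_TYPES + 1, hidden_dim)
        self.lin1 = nn.Linear(hidden_dim, hidden_dim)
        self.lin2 = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, atom_types, adj):
        h = self.embed(atom_types)                       # [N, d] node embeddings
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)  # avoid division by zero
        h = F.relu(self.lin1(adj @ h / deg))             # message-passing round 1
        h = F.relu(self.lin2(adj @ h / deg))             # message-passing round 2
        return h


def masked_atom_pretrain_step(encoder, head, atom_types, adj, mask_rate=0.15):
    """One pretraining step: mask atoms uniformly at random, predict their types."""
    n = atom_types.size(0)
    mask = torch.rand(n) < mask_rate                     # uniform masking distribution
    if not mask.any():                                   # guarantee at least one masked node
        mask[torch.randint(n, (1,))] = True
    corrupted = atom_types.clone()
    corrupted[mask] = MASK_TOKEN                         # replace with the [MASK] token
    h = encoder(corrupted, adj)
    logits = head(h[mask])                               # predict only the masked nodes
    return F.cross_entropy(logits, atom_types[mask])


# Usage on a toy 4-atom "molecule" (C-C(-O)(-H) connectivity, purely illustrative)
encoder = ToyGNNEncoder()
head = nn.Linear(64, NUM_ATOM_TYPES)
atom_types = torch.tensor([6, 6, 8, 1])                  # C, C, O, H
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 1],
                    [0, 1, 0, 0],
                    [0, 1, 0, 0]], dtype=torch.float)
loss = masked_atom_pretrain_step(encoder, head, atom_types, adj)
loss.backward()
```

In this framing, the paper's three design dimensions map directly onto the sketch: the masking distribution is the rule that selects `mask`, the prediction target is what the `head` is trained to recover (here raw atom types; the paper argues semantically richer targets help more), and the encoder architecture is the module standing in for `ToyGNNEncoder`, e.g., a Graph Transformer.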
Similar Papers
Self-Supervised Dynamical System Representations for Physiological Time-Series
Machine Learning (CS)
Helps computers understand body signals better.
Selective Masking based Self-Supervised Learning for Image Semantic Segmentation
CV and Pattern Recognition
Teaches computers to see better by guessing missing parts.
Generative and Contrastive Graph Representation Learning
Machine Learning (CS)
Helps computers understand groups of connected things better.