BLIPs: Bayesian Learned Interatomic Potentials
By: Dario Coscia, Pim de Haan, Max Welling
Potential Business Impact:
Makes computer chemistry predictions more reliable.
Machine Learning Interatomic Potentials (MLIPs) are becoming a central tool in simulation-based chemistry. However, like most deep learning models, MLIPs struggle to make accurate predictions on out-of-distribution data or when trained in a data-scarce regime, both common scenarios in simulation-based chemistry. Moreover, MLIPs do not provide uncertainty estimates by construction, which are fundamental to guiding active learning pipelines and to ensuring the accuracy of simulation results compared to quantum calculations. To address these shortcomings, we propose BLIPs: Bayesian Learned Interatomic Potentials. BLIP is a scalable, architecture-agnostic variational Bayesian framework for training or fine-tuning MLIPs, built on an adaptive version of Variational Dropout. BLIP delivers well-calibrated uncertainty estimates with minimal computational overhead for energy and force prediction at inference time, while integrating seamlessly with (equivariant) message-passing architectures. Empirical results on simulation-based computational chemistry tasks demonstrate improved predictive accuracy relative to standard MLIPs, and trustworthy uncertainty estimates, especially in data-scarce or heavily out-of-distribution regimes. Moreover, fine-tuning pretrained MLIPs with BLIP yields consistent performance gains and calibrated uncertainties.
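To illustrate the core mechanism the abstract describes, here is a minimal sketch of how multiplicative Gaussian (Variational) Dropout can yield uncertainty estimates via Monte Carlo sampling at inference time. The layer and function names, shapes, and the fixed noise rate `alpha` are illustrative assumptions, not the authors' implementation (which learns `alpha` adaptively and operates on message-passing architectures).

```python
import numpy as np

rng = np.random.default_rng(0)

class VariationalDropoutLinear:
    """Linear layer with multiplicative Gaussian weight noise:
    w = theta * (1 + sqrt(alpha) * eps), eps ~ N(0, 1).
    In Variational Dropout, alpha is a learned variational parameter;
    here it is fixed for simplicity."""
    def __init__(self, n_in, n_out, alpha=0.1):
        self.theta = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out))
        self.alpha = alpha

    def __call__(self, x):
        # Sample a fresh noise realization on every forward pass.
        eps = rng.standard_normal(self.theta.shape)
        w = self.theta * (1.0 + np.sqrt(self.alpha) * eps)
        return x @ w

def predict_with_uncertainty(layer, x, n_samples=100):
    """Monte Carlo sampling over the weight noise gives a predictive
    mean and a standard deviation (the uncertainty estimate)."""
    samples = np.stack([layer(x) for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)

layer = VariationalDropoutLinear(4, 1)
x = rng.standard_normal((8, 4))  # e.g. 8 atomic feature vectors
mean, std = predict_with_uncertainty(layer, x)
print(mean.shape, std.shape)
```

Inputs far from the training distribution tend to produce larger predictive standard deviations, which is what makes such estimates useful for active learning.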
Similar Papers
DistMLIP: A Distributed Inference Platform for Machine Learning Interatomic Potentials
Distributed, Parallel, and Cluster Computing
Makes computer models of materials run much faster.
A practical guide to machine learning interatomic potentials -- Status and future
Materials Science
Helps scientists build better computer models of atoms.
Machine learning interatomic potential can infer electrical response
Materials Science
Predicts how materials react to electricity.