Iterative Pretraining Framework for Interatomic Potentials
By: Taoyong Cui, Zhongyao Wang, Dongzhan Zhou, and more
Potential Business Impact:
Makes computer models of atoms faster and more accurate.
Machine learning interatomic potentials (MLIPs) enable efficient molecular dynamics (MD) simulations with ab initio accuracy and have been applied across various domains in physical science. However, their performance often relies on large-scale labeled training data. While existing pretraining strategies can improve model performance, they often suffer from a mismatch between the objectives of pretraining and downstream tasks, or rely on extensive labeled datasets and increasingly complex architectures to achieve broad generalization. To address these challenges, we propose Iterative Pretraining for Interatomic Potentials (IPIP), a framework designed to iteratively improve the predictive performance of MLIP models. IPIP incorporates a forgetting mechanism to prevent iterative training from converging to suboptimal local minima. Unlike general-purpose foundation models, which frequently underperform on specialized tasks due to a trade-off between generality and system-specific accuracy, IPIP achieves higher accuracy and efficiency using lightweight architectures. Compared to general-purpose force fields, this approach achieves an over 80% reduction in prediction error and up to a 4x speedup in the challenging Mo-S-O system, enabling fast and accurate simulations.
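The abstract does not spell out how the iterative loop and the forgetting mechanism interact, so the following is only a minimal sketch of the general pattern: alternate a "forgetting" step (here, hypothetically implemented as shrink-and-perturb of the parameters) with a training phase, so that each round can escape the previous round's local minimum. The toy one-parameter model, the `forget` implementation, and all constants are illustrative assumptions, not the paper's actual method.

```python
import random

def mse_grad(w, data):
    # Gradient of the mean squared error for the toy model y = w * x.
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

def train(w, data, steps=200, lr=0.05):
    # Plain gradient descent standing in for one pretraining phase.
    for _ in range(steps):
        w -= lr * mse_grad(w, data)
    return w

def forget(w, keep=0.8, noise=0.5, rng=None):
    # Hypothetical forgetting step: shrink the parameter toward zero and
    # perturb it so the next round is not locked into the previous optimum.
    rng = rng or random.Random(0)
    return w * keep + rng.gauss(0.0, noise)

def iterative_pretrain(data, rounds=3, rng=None):
    rng = rng or random.Random(42)
    w = rng.gauss(0.0, 1.0)      # random initialization
    for _ in range(rounds):
        w = forget(w, rng=rng)   # forgetting mechanism between rounds
        w = train(w, data)       # refit on the training data
    return w

# Toy data generated from the ground truth y = 3x.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = iterative_pretrain(data)
print(round(w, 3))  # → 3.0, the true slope
```

In a real MLIP setting, `train` would minimize energy and force errors against reference labels, and the forgetting step would apply to the network weights; the point of the sketch is only the loop structure, where perturbation precedes each refit.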
Similar Papers
Energy & Force Regression on DFT Trajectories is Not Enough for Universal Machine Learning Interatomic Potentials
Materials Science
Finds new materials much faster.
Teacher-student training improves accuracy and efficiency of machine learning interatomic potentials
Chemical Physics
Makes computer models of atoms run faster.
Learning Smooth and Expressive Interatomic Potentials for Physical Property Prediction
Computational Physics
Makes computer models of materials more accurate.