Iterative Pretraining Framework for Interatomic Potentials

Published: July 27, 2025 | arXiv ID: 2507.20118v1

By: Taoyong Cui, Zhongyao Wang, Dongzhan Zhou, and more

Potential Business Impact:

Makes machine-learned simulations of atomic systems faster and more accurate.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Machine learning interatomic potentials (MLIPs) enable efficient molecular dynamics (MD) simulations with ab initio accuracy and have been applied across various domains in physical science. However, their performance often relies on large-scale labeled training data. While existing pretraining strategies can improve model performance, they often suffer from a mismatch between the objectives of pretraining and downstream tasks or rely on extensive labeled datasets and increasingly complex architectures to achieve broad generalization. To address these challenges, we propose Iterative Pretraining for Interatomic Potentials (IPIP), a framework designed to iteratively improve the predictive performance of MLIP models. IPIP incorporates a forgetting mechanism to prevent iterative training from converging to suboptimal local minima. Unlike general-purpose foundation models, which frequently underperform on specialized tasks due to a trade-off between generality and system-specific accuracy, IPIP achieves higher accuracy and efficiency using lightweight architectures. Compared to general-purpose force fields, this approach achieves over 80% reduction in prediction error and up to 4x speedup in the challenging Mo-S-O system, enabling fast and accurate simulations.
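The abstract's core idea, iterative retraining with a forgetting mechanism to avoid locking into suboptimal minima, can be illustrated with a toy sketch. This is a hypothetical scalar analogue, not the paper's actual algorithm: the function name, the quadratic loss stand-in for an MLIP objective, and the blend-toward-initialization forgetting rule are all assumptions for illustration.

```python
import random

def iterative_train_with_forgetting(target, rounds=5, steps=50,
                                    forget_rate=0.3, seed=0):
    """Toy iterative-training loop with a periodic 'forgetting' step.

    A scalar parameter w stands in for an MLIP's weights and chases a
    target value via gradient descent. After each round, the forgetting
    step blends w back toward its initial value, so later rounds do not
    simply inherit whatever minimum the previous round converged to.
    (Hypothetical sketch; not the IPIP update rule from the paper.)
    """
    rng = random.Random(seed)
    w_init = rng.uniform(-1.0, 1.0)  # random initialization
    w = w_init
    for _ in range(rounds):
        for _ in range(steps):
            # gradient step on a quadratic loss (w - target)^2
            grad = 2.0 * (w - target)
            w -= 0.05 * grad
        # forgetting: partially reset toward the initialization
        w = (1.0 - forget_rate) * w + forget_rate * w_init
    # final fine-tuning round without forgetting
    for _ in range(steps):
        w -= 0.05 * 2.0 * (w - target)
    return w
```

In this sketch the forgetting step trades a small amount of per-round progress for a fresh basin of attraction each round; the final round runs without it so the result still converges near the target.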

Page Count
16 pages

Category
Physics:
Computational Physics