On The Finetuning of MLIPs Through the Lens of Iterated Maps With BPTT
By: Evan Dramko, Yizhi Zhu, Aleksandar Krivokapic, and more
Potential Business Impact:
Enables new materials to be developed faster and with less data.
Structural relaxation is vital to the creation of advanced materials. Traditional approaches built on physics-derived first-principles calculations are computationally expensive, motivating the development of machine-learning interatomic potentials (MLIPs). MLIPs are conventionally trained to faithfully reproduce first-principles-computed forces. We propose a fine-tuning method for pretrained MLIPs in which we create a fully differentiable end-to-end simulation loop that directly optimizes the predicted final structures: trajectories are unrolled, and gradients are tracked through the entire relaxation via backpropagation through time (BPTT). We show that this method achieves substantial performance gains when applied to pretrained models, yielding a nearly 50% reduction in test error across the sample datasets. Interestingly, the process is robust to substantial variation in the relaxation setup, producing negligibly different results across varied hyperparameter and procedural modifications. Experimental results indicate this robustness is due to a "preference" of BPTT for modifying the MLIP rather than the other trainable parameters. Of particular interest to practitioners, this approach lowers the data requirements for producing an effective domain-specific MLIP, addressing a common bottleneck in practical deployment.
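To make the unrolled-relaxation idea concrete, here is a minimal sketch of one fine-tuning step in PyTorch. It assumes an `mlip` module mapping atomic positions to a potential energy; the steepest-descent relaxation rule, the function and parameter names (`relax_and_finetune_step`, `n_steps`, `step_size`), and the final-structure MSE loss are illustrative assumptions, not the authors' exact setup.

```python
import torch

def relax_and_finetune_step(mlip, positions0, target_positions,
                            optimizer, n_steps=50, step_size=0.01):
    """One fine-tuning step: unroll a differentiable relaxation and
    backpropagate through time (BPTT) from a loss on the final structure."""
    # Start from a fresh leaf tensor so the relaxation graph begins here.
    positions = positions0.detach().clone().requires_grad_(True)

    for _ in range(n_steps):
        energy = mlip(positions).sum()              # predicted potential energy
        forces = -torch.autograd.grad(
            energy, positions, create_graph=True    # keep graph alive for BPTT
        )[0]
        positions = positions + step_size * forces  # steepest-descent update

    # Loss on the *final* relaxed structure, not on per-step forces.
    loss = torch.nn.functional.mse_loss(positions, target_positions)

    optimizer.zero_grad()
    loss.backward()   # gradients flow through the entire unrolled trajectory
    optimizer.step()  # updates the MLIP's parameters
    return loss.item()
```

The key ingredient is `create_graph=True` in the inner force evaluation: it retains the graph of each force computation so that the final `loss.backward()` can differentiate through every relaxation step back into the MLIP's weights, which is what distinguishes this objective from conventional per-step force matching.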
Similar Papers
Energy & Force Regression on DFT Trajectories is Not Enough for Universal Machine Learning Interatomic Potentials
Materials Science
Finds new materials much faster.
Iterative Pretraining Framework for Interatomic Potentials
Computational Physics
Makes computer models of atoms faster and more accurate.
BLIPs: Bayesian Learned Interatomic Potentials
Machine Learning (CS)
Makes computer chemistry predictions more reliable.