On The Finetuning of MLIPs Through the Lens of Iterated Maps With BPTT

Published: November 30, 2025 | arXiv ID: 2512.01067v1

By: Evan Dramko, Yizhi Zhu, Aleksandar Krivokapic, and more

Potential Business Impact:

Enables faster development of new materials with less training data.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Performing structural relaxations is vital to the creation of advanced materials. Traditional approaches built on physics-derived first-principles calculations are computationally expensive, motivating the creation of machine-learning interatomic potentials (MLIPs). The conventional approach to training MLIPs for structural relaxations is to train models to faithfully reproduce first-principles-computed forces. We propose a fine-tuning method for pretrained MLIPs in which we create a fully differentiable, end-to-end simulation loop that optimizes the predicted final structures directly. Trajectories are unrolled and gradients are tracked through the entire relaxation. We show that this method achieves substantial performance gains when applied to pretrained models, leading to a nearly 50% reduction in test error across the sample datasets. Interestingly, we show the process is robust to substantial variation in the relaxation setup, achieving negligibly different results across varied hyperparameter and procedural modifications. Experimental results indicate this is due to a "preference" of backpropagation through time (BPTT) to modify the MLIP rather than the other trainable parameters. Of particular interest to practitioners is that this approach lowers the data requirements for producing an effective domain-specific MLIP, addressing a common bottleneck in practical deployment.
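
The following is a minimal sketch of the idea described in the abstract, not the authors' implementation: a toy PyTorch energy model stands in for the MLIP, a gradient-descent loop stands in for the relaxer, and the loss compares the final relaxed structure to a reference structure so that gradients flow through every unrolled step (BPTT). The model architecture, step count, step size, and data here are illustrative assumptions.

```python
import torch

def relax(positions, mlip, n_steps=20, step_size=1e-2):
    """Unrolled gradient-descent relaxation; the autograd graph spans every step."""
    x = positions.clone().requires_grad_(True)
    for _ in range(n_steps):
        energy = mlip(x).sum()
        # Forces are -dE/dx; create_graph=True keeps the graph so BPTT can
        # differentiate through the force computation itself.
        (grad_x,) = torch.autograd.grad(energy, x, create_graph=True)
        x = x - step_size * grad_x  # one relaxation step, kept differentiable
    return x

# Toy stand-in for a pretrained MLIP: per-atom coordinates -> per-atom energy.
mlip = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
opt = torch.optim.Adam(mlip.parameters(), lr=1e-4)

# One fine-tuning step on a single (initial, reference-relaxed) structure pair.
x0 = torch.randn(8, 3)     # initial atomic positions (toy data)
x_ref = torch.randn(8, 3)  # first-principles-relaxed reference (toy data)

x_final = relax(x0, mlip)
loss = torch.nn.functional.mse_loss(x_final, x_ref)  # structure-level loss, not force matching
opt.zero_grad()
loss.backward()  # BPTT: gradients propagate through all relaxation steps
opt.step()
```

The key design choice, per the abstract, is that the loss is placed on the predicted final structure rather than on per-step forces, so the relaxation loop itself becomes part of the training graph.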

Country of Origin
🇺🇸 United States

Page Count
15 pages

Category
Condensed Matter: Materials Science