Ensemble Knowledge Distillation for Machine Learning Interatomic Potentials
By: Sakib Matin, Emily Shinkle, Yulia Pimonova, and more
Potential Business Impact:
Teaches computers to predict material behavior accurately.
The quality of machine learning interatomic potentials (MLIPs) strongly depends on the quantity of training data as well as the quantum chemistry (QC) level of theory used to generate it. Datasets produced with high-fidelity QC methods are typically restricted to small molecules and may omit energy gradients, which makes it difficult to train accurate MLIPs. We present an ensemble knowledge distillation (EKD) method to improve MLIP accuracy when training on energy-only datasets. First, multiple teacher models are trained to the QC energies and then used to generate atomic forces for all configurations in the dataset. Next, the student MLIP is trained to both the QC energies and the ensemble-averaged forces generated by the teacher models. We apply this workflow to the ANI-1ccx dataset, where configuration energies are computed at the coupled cluster level of theory. The resulting student MLIPs achieve new state-of-the-art accuracy on the COMP6 benchmark and show improved stability in molecular dynamics simulations.
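To make the workflow concrete, here is a minimal sketch of the EKD training objective in PyTorch. It assumes a generic model interface `model(positions, species) -> energy` and illustrative loss weights `w_energy` and `w_force`; these names and values are placeholders for exposition, not the paper's actual implementation. Forces are obtained as the negative gradient of the predicted energy with respect to atomic positions, averaged over the teacher ensemble.

```python
import torch

def teacher_forces(teachers, positions, species):
    """Ensemble-averaged forces F = -dE/dR from energy-only teacher models.

    `teachers` is any iterable of models mapping (positions, species) -> energy.
    The model interface is a placeholder, not the paper's exact code.
    """
    forces = []
    for teacher in teachers:
        pos = positions.clone().requires_grad_(True)
        energy = teacher(pos, species).sum()
        grad, = torch.autograd.grad(energy, pos)
        forces.append(-grad)                      # force = negative energy gradient
    # Average over the teacher ensemble; detach so targets carry no graph.
    return torch.stack(forces).mean(dim=0).detach()

def ekd_loss(student, teachers, positions, species, qc_energy,
             w_energy=1.0, w_force=0.1):
    """Student loss: QC energies plus teacher-ensemble forces (EKD sketch)."""
    pos = positions.clone().requires_grad_(True)
    pred_energy = student(pos, species).sum()
    # create_graph=True so the force loss backpropagates to student weights.
    grad, = torch.autograd.grad(pred_energy, pos, create_graph=True)
    pred_forces = -grad
    target_forces = teacher_forces(teachers, positions, species)
    loss_e = (pred_energy - qc_energy).pow(2)
    loss_f = (pred_forces - target_forces).pow(2).mean()
    return w_energy * loss_e + w_force * loss_f
```

In this sketch the teachers see only QC energies during their own training; the student then receives both the original QC energy labels and the ensemble-averaged force targets, which is the core idea of the distillation step described above.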
Similar Papers
Equivariant Machine Learning Interatomic Potentials with Global Charge Redistribution
Chemical Physics
Predicts how atoms connect, even far apart.
Knowledge Distillation Framework for Accelerating High-Accuracy Neural Network-Based Molecular Dynamics Simulations
Machine Learning (CS)
Makes computer models of materials run faster.
DistMLIP: A Distributed Inference Platform for Machine Learning Interatomic Potentials
Distributed, Parallel, and Cluster Computing
Makes computer models of materials run much faster.