DistMLIP: A Distributed Inference Platform for Machine Learning Interatomic Potentials
By: Kevin Han, Bowen Deng, Amir Barati Farimani, and more
Potential Business Impact:
Makes computer models of materials run much faster.
Large-scale atomistic simulations are essential to bridge computational materials science and chemistry to realistic materials and drug discovery applications. In the past few years, the rapid development of machine learning interatomic potentials (MLIPs) has offered a way to scale up quantum mechanical calculations. Parallelizing these interatomic potentials across multiple devices is a challenging but promising route to further extending simulation scales toward real-world applications. In this work, we present DistMLIP, an efficient distributed inference platform for MLIPs based on zero-redundancy, graph-level parallelization. In contrast to conventional space-partitioning parallelization, DistMLIP enables efficient MLIP parallelization through graph partitioning, allowing multi-device inference on flexible model architectures such as multi-layer graph neural networks. DistMLIP presents an easy-to-use, flexible, plug-in interface that enables distributed inference of pre-existing MLIPs. We demonstrate DistMLIP on four widely used, state-of-the-art MLIPs: CHGNet, MACE, TensorNet, and eSEN. With DistMLIP, existing foundation potentials can perform near-million-atom calculations in a few seconds on 8 GPUs.
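To illustrate the idea behind zero-redundancy, graph-level parallelization, the sketch below partitions the edges of an atomistic neighbor graph across devices so that each edge message is computed exactly once, then accumulates the partial per-atom results. This is a minimal NumPy illustration under assumed details, not DistMLIP's actual API: the function names (`build_edges`, `partition_edges`, `distributed_message_pass`), the ownership rule (edge assigned to the device owning its source atom), and the toy inverse-distance "message" are all hypothetical stand-ins for the paper's method.

```python
import numpy as np

def build_edges(positions, cutoff):
    # Brute-force neighbor list: a directed edge (i, j) for every
    # pair of distinct atoms closer than the cutoff radius.
    n = len(positions)
    src, dst = [], []
    for i in range(n):
        for j in range(n):
            if i != j and np.linalg.norm(positions[i] - positions[j]) < cutoff:
                src.append(i)
                dst.append(j)
    return np.array(src, dtype=int), np.array(dst, dtype=int)

def partition_edges(src, n_devices):
    # Zero-redundancy split: each edge is assigned to exactly one device
    # (here, the device that "owns" the source atom), so no edge-level
    # computation is duplicated across devices. In contrast, spatial
    # partitioning replicates halo atoms and recomputes boundary edges.
    return src % n_devices

def distributed_message_pass(positions, cutoff, n_devices):
    src, dst = build_edges(positions, cutoff)
    owner = partition_edges(src, n_devices)
    per_atom = np.zeros(len(positions))
    for d in range(n_devices):  # serial stand-in for parallel execution
        mask = owner == d
        # Toy per-edge "message" (inverse distance), scatter-added
        # onto the destination atoms owned anywhere in the system.
        r = np.linalg.norm(positions[src[mask]] - positions[dst[mask]], axis=1)
        np.add.at(per_atom, dst[mask], 1.0 / r)
    return per_atom
```

Because every edge lands on exactly one device, summing the per-device partial results reproduces the single-device answer; e.g. `distributed_message_pass(pos, 1.5, 4)` matches `distributed_message_pass(pos, 1.5, 1)` for any positions, which is the correctness property a graph-partitioned scheme must preserve.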
Similar Papers
BLIPs: Bayesian Learned Interatomic Potentials
Machine Learning (CS)
Makes computer chemistry predictions more reliable.
Equivariant Machine Learning Interatomic Potentials with Global Charge Redistribution
Chemical Physics
Predicts how atoms connect, even far apart.
A practical guide to machine learning interatomic potentials -- Status and future
Materials Science
Helps scientists build better computer models of atoms.