Maximizing Efficiency of Dataset Compression for Machine Learning Potentials With Information Theory
By: Benjamin Yu, Vincenzo Lordi, Daniel Schwalbe-Koda
Potential Business Impact:
Finds the smallest set of training data that still makes atomistic AI accurate.
Machine learning interatomic potentials (MLIPs) offer high accuracy at a much lower cost than density functional theory calculations, but their performance often depends on the size and diversity of their training datasets. Large datasets improve model accuracy and generalization but are computationally expensive to produce and train on, while smaller datasets risk discarding rare but important atomic environments, compromising MLIP accuracy and reliability. Here, we develop an information-theoretical framework to quantify the efficiency of dataset compression methods and propose an algorithm that maximizes this efficiency. By framing atomistic dataset compression as an instance of the minimum set cover (MSC) problem over atom-centered environments, our method identifies the smallest subset of structures that contains as much information as possible from the original dataset while pruning redundant information. The approach is demonstrated extensively on the GAP-20 and TM23 datasets and validated on 64 varied datasets from the ColabFit repository. Across all cases, MSC consistently retains outliers, preserves dataset diversity, and reproduces the long-tail distributions of forces even at high compression rates, outperforming other subsampling methods. Furthermore, MLIPs trained on MSC-compressed datasets exhibit reduced error for out-of-distribution data even in low-data regimes. We explain these results with an outlier analysis and show that such quantitative conclusions could not be achieved with conventional dimensionality reduction methods. The algorithm is implemented in the open-source QUESTS package and supports several tasks in atomistic modeling, from data subsampling and outlier detection to training improved MLIPs at a lower cost.
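The abstract does not spell out the selection procedure, but the greedy heuristic is the standard approximation for minimum set cover. Below is a minimal sketch of that idea, assuming each structure's atom-centered environments have already been discretized into hashable labels (for example, by clustering descriptor vectors). This is not the QUESTS implementation; the function and variable names are hypothetical.

```python
# Minimal sketch: greedy minimum set cover (MSC) over atom-centered
# environments, assuming environments are pre-discretized into labels.
# Hypothetical illustration only; see the QUESTS package for the
# authors' actual method.

def greedy_min_set_cover(env_sets):
    """Select structure indices whose environment sets cover the union.

    env_sets: list of sets; env_sets[i] holds the labels of the
              atom-centered environments found in structure i.
    Returns the list of selected structure indices, in selection order.
    """
    uncovered = set().union(*env_sets)  # all environments in the dataset
    selected = []
    while uncovered:
        # Pick the structure covering the most still-uncovered environments.
        best = max(range(len(env_sets)),
                   key=lambda i: len(env_sets[i] & uncovered))
        gain = env_sets[best] & uncovered
        if not gain:  # no structure adds coverage; stop defensively
            break
        selected.append(best)
        uncovered -= gain
    return selected

# Usage: structure 0 is the only carrier of rare environment "d",
# so the greedy cover must keep it.
sets = [{"a", "b", "d"}, {"a", "b", "c"}, {"b", "c"}]
print(greedy_min_set_cover(sets))  # -> [0, 1], covering {a, b, c, d}
```

Note how the greedy choice necessarily keeps any structure that is the sole carrier of a rare environment, which is consistent with the paper's finding that MSC retains outliers and long-tail force distributions even at high compression rates.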
Similar Papers
Comparing the latent features of universal machine-learning interatomic potentials
Chemical Physics
Helps computers understand how atoms connect better.
Composable and adaptive design of machine learning interatomic potentials guided by Fisher-information analysis
Materials Science
Makes computer models of atoms more accurate.
Active Learning Strategies for Efficient Machine-Learned Interatomic Potentials Across Diverse Material Systems
Machine Learning (CS)
Finds new materials faster with smart computer guesses.