Learning collective variables that preserve transition rates
By: Shashank Sule, Arnav Mehta, Maria K. Cameron
Potential Business Impact:
Speeds up predictions of how molecules change shape.
Collective variables (CVs) play a crucial role in capturing rare events in high-dimensional systems, motivating the continual search for principled approaches to their design. In this work, we revisit the framework of quantitative coarse graining and identify the orthogonality condition from Legoll and Lelievre (2010) as a key criterion for constructing CVs that accurately preserve the statistical properties of the original process. We show that when the orthogonality condition is satisfied, error estimates in both relative entropy and pathwise distance scale proportionally with the degree of scale separation. Building on this foundation, we introduce a general numerical method for designing neural network-based CVs that integrates tools from manifold learning with group-invariant featurization. To demonstrate the efficacy of our approach, we construct CVs for butane and obtain one that reproduces the anti-gauche transition rate with less than ten percent relative error. Additionally, we provide empirical evidence challenging the necessity of uniform positive definiteness in diffusion tensors for transition-rate reproduction, and we highlight the critical role of light atoms in CV design for molecular dynamics.
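As background for the coarse-graining framework referenced in the abstract, the display below recalls the standard effective dynamics of Legoll and Lelievre for overdamped Langevin dynamics and a scalar CV \xi. It is quoted as context only; the paper's own error estimates and the precise statement of its orthogonality condition are not reproduced here.

dX_t = -\nabla V(X_t)\,dt + \sqrt{2\beta^{-1}}\,dW_t, \qquad z_t = \xi(X_t),
d\hat z_t = b(\hat z_t)\,dt + \sqrt{2\beta^{-1}}\,\sigma(\hat z_t)\,dB_t,
b(z) = \mathbb{E}_{\mu}\!\left[\, -\nabla V\cdot\nabla\xi + \beta^{-1}\Delta\xi \;\middle|\; \xi = z \,\right],
\sigma^2(z) = \mathbb{E}_{\mu}\!\left[\, |\nabla\xi|^{2} \;\middle|\; \xi = z \,\right],

where \mu \propto e^{-\beta V} is the Gibbs measure and the conditional expectations are taken with respect to \mu restricted to the level set \{\xi = z\}.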
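The abstract does not spell out the numerical method, so the following is only a minimal sketch of the general recipe it names: a group-invariant featurization (here, pairwise interatomic distances, which are invariant to rotations and translations) feeding a neural-network CV that is regressed onto manifold-learning coordinates (e.g., diffusion-map eigenvectors, assumed precomputed as psi). The featurization, architecture, loss, and the train_cv helper are hypothetical illustrations, not the authors' implementation.

# Illustrative sketch (not the paper's code): neural-network CV over an
# invariant featurization, fit to precomputed manifold-learning coordinates.
import torch
import torch.nn as nn

def pairwise_distance_features(x: torch.Tensor) -> torch.Tensor:
    """x: (batch, n_atoms, 3) Cartesian coordinates -> (batch, n_pairs) distances."""
    diff = x.unsqueeze(2) - x.unsqueeze(1)      # (batch, n, n, 3) displacement vectors
    dist = torch.linalg.norm(diff, dim=-1)      # (batch, n, n) distance matrix
    i, j = torch.triu_indices(x.shape[1], x.shape[1], offset=1)
    return dist[:, i, j]                        # keep each unordered pair once

class CVNet(nn.Module):
    """MLP mapping invariant features to a low-dimensional collective variable."""
    def __init__(self, n_features: int, cv_dim: int = 1, width: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, cv_dim),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(pairwise_distance_features(coords))

def train_cv(coords: torch.Tensor, psi: torch.Tensor, epochs: int = 200) -> CVNet:
    """Hypothetical training loop: regress the CV onto diffusion-map coordinates psi
    computed offline from the same trajectory (the manifold-learning ingredient)."""
    n_pairs = coords.shape[1] * (coords.shape[1] - 1) // 2
    model = CVNet(n_features=n_pairs, cv_dim=psi.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(coords), psi)
        loss.backward()
        opt.step()
    return model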
Similar Papers
Learning Collective Variables from Time-lagged Generation
Machine Learning (CS)
Learns how molecules move, to speed up science.
Enhancing Diffusion-Based Sampling with Molecular Collective Variables
Chemical Physics
Finds new molecule shapes faster than before.
Neural Network Surrogates for Free Energy Computation of Complex Chemical Systems
Machine Learning (CS)
Helps scientists understand molecules better.