Thermodynamically Consistent Latent Dynamics Identification for Parametric Systems
By: Xiaolong He, Yeonjong Shin, Anthony Gruber, and more
Potential Business Impact:
Makes physics simulations run thousands of times faster while still obeying basic thermodynamic laws.
We propose an efficient thermodynamics-informed latent space dynamics identification (tLaSDI) framework for the reduced-order modeling of parametric nonlinear dynamical systems. This framework integrates autoencoders for dimensionality reduction with newly developed parametric GENERIC formalism-informed neural networks (pGFINNs), which enable efficient learning of parametric latent dynamics while preserving key thermodynamic principles such as free energy conservation and entropy generation across the parameter space. To further enhance model performance, a physics-informed active learning strategy is incorporated, leveraging a greedy, residual-based error indicator to adaptively sample informative training data, outperforming uniform sampling at equivalent computational cost. Numerical experiments on the Burgers' equation and the 1D/1V Vlasov-Poisson equation demonstrate that the proposed method achieves up to a 3,528x speed-up with 1-3% relative errors, as well as significant reductions in training (50-90%) and inference (57-61%) costs. Moreover, the learned latent space dynamics reveal the underlying thermodynamic behavior of the system, offering valuable insights into the physical-space dynamics.
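For context, the GENERIC formalism referenced in the abstract evolves a latent state z as dz/dt = L∇E(z) + M∇S(z), combining a reversible part driven by an energy gradient (with skew-symmetric L) and an irreversible part driven by an entropy gradient (with symmetric positive semi-definite M). The sketch below is a minimal, illustrative rendering of that structure in PyTorch, not the authors' implementation: names such as ScalarNet, generic_rhs, and latent_dim are assumptions, and the degeneracy conditions (L∇S = 0, M∇E = 0) that pGFINNs enforce by construction to guarantee energy conservation and non-negative entropy production are omitted here for brevity.

```python
# Minimal sketch (assumed, not the paper's code) of GENERIC-structured latent dynamics.
import torch
import torch.nn as nn

latent_dim = 4  # assumed small latent dimension produced by the autoencoder

class ScalarNet(nn.Module):
    """Small MLP producing a scalar potential (energy E or entropy S)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, 1))
    def forward(self, z):
        return self.net(z).squeeze(-1)

E_net, S_net = ScalarNet(latent_dim), ScalarNet(latent_dim)

# Learnable operators: L is built skew-symmetric, M symmetric positive
# semi-definite, mirroring the GENERIC structure dz/dt = L*gradE + M*gradS.
A = nn.Parameter(torch.randn(latent_dim, latent_dim) * 0.1)
B = nn.Parameter(torch.randn(latent_dim, latent_dim) * 0.1)

def generic_rhs(z):
    z = z.requires_grad_(True)
    gradE = torch.autograd.grad(E_net(z).sum(), z, create_graph=True)[0]
    gradS = torch.autograd.grad(S_net(z).sum(), z, create_graph=True)[0]
    L = A - A.T   # skew-symmetric -> reversible (conservative) part
    M = B @ B.T   # symmetric PSD  -> irreversible (dissipative) part
    return gradE @ L.T + gradS @ M.T

z0 = torch.randn(8, latent_dim)   # a batch of latent states
dzdt = generic_rhs(z0)            # latent time derivative, shape (8, latent_dim)
print(dzdt.shape)
```

In practice, this right-hand side would be integrated in time in the latent space and decoded back to physical space by the autoencoder; the paper's active learning loop then uses a greedy, residual-based error indicator to decide which parameter samples to add to the training set.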
Similar Papers
Sequential decoder training for improved latent space dynamics identification
Machine Learning (CS)
Makes computer simulations faster and more accurate.
mLaSDI: Multi-stage latent space dynamics identification
Machine Learning (CS)
Makes computer models of science faster and better.
LAPD: Langevin-Assisted Bayesian Active Learning for Physical Discovery
Machine Learning (Stat)
Finds science rules with less data.