Data-driven stochastic reduced-order modeling of parametrized dynamical systems
By: Andrew F. Ilersich, Kevin Course, Prasanth B. Nair
Modeling complex dynamical systems under varying conditions is computationally intensive, often rendering high-fidelity simulations intractable. Although reduced-order models (ROMs) offer a promising solution, current methods often struggle with stochastic dynamics and fail to quantify prediction uncertainty, limiting their utility in robust decision-making contexts. To address these challenges, we introduce a data-driven framework for learning continuous-time stochastic ROMs that generalize across parameter spaces and forcing conditions. Our approach, based on amortized stochastic variational inference, leverages a reparametrization trick for Markov Gaussian processes to eliminate the need for computationally expensive forward solvers during training. This enables us to jointly learn a probabilistic autoencoder and stochastic differential equations governing the latent dynamics, at a computational cost that is independent of the dataset size and system stiffness. Additionally, our approach offers the flexibility of incorporating physics-informed priors if available. Numerical studies are presented for three challenging test problems, where we demonstrate excellent generalization to unseen parameter combinations and forcings, and significant efficiency gains compared to existing approaches.
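The core idea in the abstract, eliminating forward SDE solves during training by reparametrizing a Markov Gaussian process posterior, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the variational posterior parametrization `(m, s)`, the toy drift `drift_net`, and the finite-difference drift-matching loss are all simplifying assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Time grid for the latent trajectory.
ts = np.linspace(0.0, 1.0, 50)

# Variational posterior over the latent state: a Markov Gaussian process,
# here parametrized (as an assumption) by a mean path m(t) and marginal
# standard deviation s(t) on the grid.
m = np.sin(2 * np.pi * ts)      # posterior mean path
s = 0.1 * np.ones_like(ts)      # posterior marginal std

def sample_latent_path(m, s, rng):
    """Reparametrization trick: z(t) = m(t) + s(t) * eps, eps ~ N(0, I).
    Sampling is a deterministic map of Gaussian noise, so no forward
    SDE solver is needed and gradients flow through (m, s)."""
    eps = rng.standard_normal(m.shape)
    return m + s * eps

def drift_net(z, theta):
    """Toy learnable drift f_theta(z) = theta[0]*z + theta[1] (assumed form);
    in practice this would be a neural network over the latent state."""
    return theta[0] * z + theta[1]

def residual_loss(theta, m, s, ts, rng, n_samples=8):
    """Monte Carlo drift-matching term: the sampled path's finite-difference
    derivative should agree with the learned drift along the path. Training
    cost scales with the grid and sample count, not with system stiffness."""
    dt = ts[1] - ts[0]
    loss = 0.0
    for _ in range(n_samples):
        z = sample_latent_path(m, s, rng)
        dz = np.diff(z) / dt              # path derivative (finite difference)
        f = drift_net(z[:-1], theta)      # drift evaluated along the path
        loss += np.mean((dz - f) ** 2)
    return loss / n_samples

theta = np.array([0.0, 0.0])
loss0 = residual_loss(theta, m, s, ts, rng)
```

In the full framework this drift-matching term would sit inside an evidence lower bound alongside a probabilistic autoencoder's reconstruction term, with the posterior and drift amortized over parameters and forcings; the sketch above only shows why reparametrized sampling removes the forward solver from the training loop.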