RRAEDy: Adaptive Latent Linearization of Nonlinear Dynamical Systems
By: Jad Mounayer, Sebastian Rodriguez, Jerome Tomezyk, and more
Potential Business Impact:
Automatically finds simple, predictable patterns hidden inside complex systems that change over time.
Most existing latent-space models for dynamical systems require fixing the latent dimension in advance, rely on complex loss balancing to approximate linear dynamics, and do not regularize the latent variables. We introduce RRAEDy, a model that removes these limitations by discovering the appropriate latent dimension while enforcing both regularized and linearized dynamics in the latent space. Built upon Rank-Reduction Autoencoders (RRAEs), RRAEDy automatically ranks and prunes latent variables through their singular values while learning a latent Dynamic Mode Decomposition (DMD) operator that governs their temporal evolution. This structure-free yet linearly constrained formulation enables the model to learn stable, low-dimensional dynamics without auxiliary losses or manual tuning. We provide a theoretical analysis demonstrating the stability of the learned operator and showcase the generality of our model by proposing an extension that handles parametric ODEs. Experiments on canonical benchmarks, including the Van der Pol oscillator, Burgers' equation, 2D Navier-Stokes, and Rotating Gaussians, show that RRAEDy achieves accurate and robust predictions. Our code is open-source and available at https://github.com/JadM133/RRAEDy. We also provide a video summarizing the main results at https://youtu.be/ox70mSSMGrM.
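To make the two ingredients named in the abstract concrete, here is a minimal NumPy sketch of how a latent DMD operator can be fit from snapshots of latent trajectories, with SVD-based rank truncation standing in for singular-value pruning. This is not the authors' implementation (their code is at the GitHub link above); the function name `latent_dmd`, the `rel_tol` threshold, and the toy 2D system are illustrative assumptions.

```python
import numpy as np

def latent_dmd(Z, rel_tol=1e-10):
    """Fit a linear operator A with z_{t+1} ~= A z_t for latent snapshots Z (d x T).

    Rank is chosen by thresholding singular values of the snapshot matrix,
    mimicking singular-value-based pruning of latent variables. (Illustrative
    sketch, not the RRAEDy implementation.)
    """
    X, Y = Z[:, :-1], Z[:, 1:]                 # time-shifted snapshot pairs
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = int(np.sum(s > rel_tol * s[0]))        # keep modes above the threshold
    # Least-squares solution A = Y X^+ using the rank-r truncated SVD of X.
    A = Y @ Vt[:r].T @ np.diag(1.0 / s[:r]) @ U[:, :r].T
    return A, r

# Toy usage: recover a known stable linear latent system from its trajectory.
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
Z = np.empty((2, 50))
Z[:, 0] = [1.0, 1.0]
for t in range(49):
    Z[:, t + 1] = A_true @ Z[:, t]

A_hat, r = latent_dmd(Z)
```

Because the toy trajectory spans the full 2D latent space, the least-squares fit recovers `A_true` up to floating-point error, and all eigenvalues of the learned operator lie inside the unit circle, i.e. the discrete-time dynamics are stable.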
Similar Papers
Latent Diffeomorphic Dynamic Mode Decomposition
Machine Learning (CS)
Predicts water flow using smart math.
Variational Rank Reduction Autoencoder
Machine Learning (CS)
Makes AI create better, more realistic pictures.
Thermodynamically Consistent Latent Dynamics Identification for Parametric Systems
Machine Learning (CS)
Makes computer models run much faster and smarter.