Towards Fast Coarse-graining and Equation Discovery with Foundation Inference Models
By: Manuel Hinz, Maximilian Mauel, Patrick Seifner, and more
Potential Business Impact:
Finds the few hidden variables that drive complex videos.
High-dimensional recordings of dynamical processes are often characterized by a much smaller set of effective variables, evolving on low-dimensional manifolds. Identifying these latent dynamics requires solving two intertwined problems: discovering appropriate coarse-grained variables and simultaneously fitting the governing equations. Most machine learning approaches tackle these tasks jointly by training autoencoders together with models that enforce dynamical consistency. We propose to decouple the two problems by leveraging the recently introduced Foundation Inference Models (FIMs). FIMs are pretrained models that estimate the infinitesimal generators of dynamical systems (e.g., the drift and diffusion of a stochastic differential equation) in zero-shot mode. By amortizing the inference of the dynamics through a FIM with frozen weights, and training only the encoder-decoder map, we define a simple, simulation-consistent loss that stabilizes representation learning. A proof of concept on a stochastic double-well system with semicircle diffusion, embedded into synthetic video data, illustrates the potential of this approach for fast and reusable coarse-graining pipelines.
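The pipeline described above lends itself to a compact sketch. Below is a minimal, hedged illustration in PyTorch: a double-well drift and a semicircle-shaped diffusion (the exact functional forms are assumptions, since the abstract only names the system), an Euler-Maruyama integrator for the latent SDE, and a training step in which a frozen FIM stand-in (`FrozenFIM`, a hypothetical stub; the real model is a pretrained network) supplies drift and diffusion estimates while only the encoder-decoder weights are updated under a simulation-consistent loss. Network sizes, the 64x64 frame shape, and the loss composition are illustrative choices, not the paper's implementation.

```python
import torch
import torch.nn as nn

# --- Assumed ground-truth latent SDE: dx = f(x) dt + g(x) dW ----------------
def drift(x):
    # Double-well drift f(x) = x - x**3 (wells at x = +/-1); assumed form.
    return x - x**3

def diffusion(x, radius=2.0):
    # Semicircle-shaped diffusion g(x) = sqrt(r^2 - x^2) / r on [-r, r]; assumed form.
    return torch.sqrt(torch.clamp(radius**2 - x**2, min=0.0)) / radius

def euler_maruyama(x0, n_steps, dt):
    # Integrate the SDE forward to generate latent trajectories.
    xs, x = [x0], x0
    for _ in range(n_steps):
        x = x + drift(x) * dt + diffusion(x) * (dt ** 0.5) * torch.randn_like(x)
        xs.append(x)
    return torch.stack(xs)  # shape: (n_steps + 1, batch)

# --- Hypothetical stand-in for a pretrained Foundation Inference Model ------
class FrozenFIM:
    # The real FIM is a pretrained network with frozen weights that estimates
    # drift and diffusion from a trajectory in zero-shot mode; this stub just
    # returns the ground-truth generators so the sketch runs end to end.
    def estimate_generators(self, latent_traj):
        return drift, diffusion

# --- Encoder-decoder: the only trainable components -------------------------
encoder = nn.Sequential(nn.Linear(64 * 64, 128), nn.ReLU(), nn.Linear(128, 1))
decoder = nn.Sequential(nn.Linear(1, 128), nn.ReLU(), nn.Linear(128, 64 * 64))
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
)
fim = FrozenFIM()

def training_step(frames, dt=0.01):
    # frames: (T, batch, 64*64) flattened video observations of the process.
    z = encoder(frames).squeeze(-1)            # (T, batch) coarse-grained variable
    f_hat, g_hat = fim.estimate_generators(z)  # amortized inference, FIM frozen
    # Simulation-consistent loss: a one-step Euler-Maruyama prediction in the
    # latent space must decode to the next observed frame.
    z_next = z[:-1] + f_hat(z[:-1]) * dt \
        + g_hat(z[:-1]) * (dt ** 0.5) * torch.randn_like(z[:-1])
    recon_loss = ((decoder(z.unsqueeze(-1)) - frames) ** 2).mean()
    sim_loss = ((decoder(z_next.unsqueeze(-1)) - frames[1:]) ** 2).mean()
    loss = recon_loss + sim_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the actual pipeline, the stub's generator estimates would come from the pretrained FIM, and the encoder-decoder would typically be convolutional for video frames; the key design choice, per the abstract, is that gradients never update the FIM, so representation learning and equation inference stay decoupled.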
Similar Papers
Towards Foundation Inference Models that Learn ODEs In-Context
Machine Learning (CS)
Helps computers learn how things change from messy data.
Data-Efficient Symbolic Regression via Foundation Model Distillation
Machine Learning (CS)
Finds hidden science rules from few examples.
Ideas in Inference-time Scaling can Benefit Generative Pre-training Algorithms
Machine Learning (CS)
Makes AI understand pictures and words faster.