Predicting symbolic ODEs from multiple trajectories
By: Yakup Emre Şahin, Niki Kilbertus, Sören Becker
Potential Business Impact:
Finds hidden math rules from watching things move.
We introduce MIO, a transformer-based model for inferring symbolic ordinary differential equations (ODEs) from multiple observed trajectories of a dynamical system. By combining multiple instance learning with transformer-based symbolic regression, the model effectively leverages repeated observations of the same system to learn more generalizable representations of the underlying dynamics. We investigate different instance aggregation strategies and show that even simple mean aggregation can substantially boost performance. MIO is evaluated on systems ranging from one to four dimensions and under varying noise levels, consistently outperforming existing baselines.
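To make the multiple-instance idea in the abstract concrete, here is a minimal sketch (not the authors' released code) of how repeated trajectories of the same system could be encoded separately and combined by simple mean aggregation before decoding a symbolic expression. All class names, dimensions, and the token vocabulary are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of a multiple-instance, transformer-based symbolic ODE predictor.
# Each observed trajectory is encoded by a shared transformer encoder, the per-trajectory
# embeddings are mean-aggregated, and a transformer decoder emits the symbolic ODE as tokens.
import torch
import torch.nn as nn


class MultiTrajectoryODEPredictor(nn.Module):
    def __init__(self, state_dim=2, d_model=128, vocab_size=64):
        super().__init__()
        # Embed each (t, x(t)) observation point into the model dimension.
        self.point_embed = nn.Linear(1 + state_dim, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=4)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=4)
        self.token_embed = nn.Embedding(vocab_size, d_model)
        self.out_proj = nn.Linear(d_model, vocab_size)

    def encode_instance(self, traj):
        # traj: (batch, n_points, 1 + state_dim) -> one embedding per trajectory.
        h = self.encoder(self.point_embed(traj))   # (batch, n_points, d_model)
        return h.mean(dim=1)                       # pool over observation points

    def forward(self, trajectories, target_tokens):
        # trajectories: (batch, n_instances, n_points, 1 + state_dim)
        # target_tokens: (batch, expr_len) prefix of the symbolic expression.
        b, k, n, f = trajectories.shape
        inst = self.encode_instance(trajectories.reshape(b * k, n, f))
        inst = inst.reshape(b, k, -1)
        # Simple mean aggregation across repeated observations of the same system,
        # the strategy the abstract reports as already effective.
        memory = inst.mean(dim=1, keepdim=True)     # (batch, 1, d_model)
        tgt = self.token_embed(target_tokens)       # (batch, expr_len, d_model)
        dec = self.decoder(tgt, memory)
        return self.out_proj(dec)                   # logits over the symbol vocabulary


if __name__ == "__main__":
    model = MultiTrajectoryODEPredictor()
    trajs = torch.randn(4, 5, 50, 3)        # 4 systems, 5 trajectories each, 50 points of (t, x1, x2)
    tokens = torch.randint(0, 64, (4, 10))  # prefix of the target expression
    print(model(trajs, tokens).shape)       # torch.Size([4, 10, 64])
```

The sketch only illustrates the aggregation step the abstract highlights; the paper additionally compares other instance-aggregation strategies beyond the mean pooling shown here.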
Similar Papers
Towards Foundation Inference Models that Learn ODEs In-Context
Machine Learning (CS)
Helps computers learn how things change from messy data.
On Approaches to Building Surrogate ODE Models for Diffusion Bridges
Machine Learning (CS)
Makes AI create images much faster and easier.
Neural ODE Transformers: Analyzing Internal Dynamics and Adaptive Fine-tuning
Machine Learning (CS)
Makes AI understand itself better.