Learning the Simplest Neural ODE
By: Yuji Okamoto, Tomoya Takeuchi, Yusuke Sakemi
Potential Business Impact:
Makes it easier to teach computers about changing things.
Since the advent of the "Neural Ordinary Differential Equation (Neural ODE)" paper, learning ODEs with deep learning has been applied to system identification, time-series forecasting, and related areas. Exploiting the diffeomorphic nature of ODE solution maps, Neural ODEs have also been used in generative modeling. Despite their rich potential to incorporate various kinds of physical information, training Neural ODEs remains challenging in practice. This study uses the simplest one-dimensional linear model to demonstrate why training Neural ODEs is difficult. We then propose a new stabilization method and provide an analytical convergence analysis. The insights and techniques presented here serve as a concise tutorial for researchers beginning work on Neural ODEs.
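To make the setting concrete, here is a minimal sketch (not the paper's code) of the kind of one-dimensional linear problem the abstract refers to: learning the scalar parameter a of the ODE dx/dt = a*x from a sampled trajectory by unrolling an explicit Euler solver and backpropagating through it. The true value a_true, the time grid, and the optimizer settings are all illustrative assumptions.

```python
import torch

# Hypothetical minimal example: fit the scalar `a` of dx/dt = a * x
# to trajectory data generated from an assumed ground-truth value.
torch.manual_seed(0)

a_true = -1.5                          # assumed ground-truth rate
x0 = torch.tensor(1.0)                 # initial condition
t = torch.linspace(0.0, 2.0, 50)       # observation times
x_data = x0 * torch.exp(a_true * t)    # closed-form solution of the linear ODE

def integrate(a, x0, t):
    """Forward-Euler rollout of dx/dt = a * x on the time grid t."""
    xs = [x0]
    for k in range(len(t) - 1):
        dt = t[k + 1] - t[k]
        xs.append(xs[-1] + dt * a * xs[-1])
    return torch.stack(xs)

a = torch.tensor(0.5, requires_grad=True)   # deliberately poor initial guess
opt = torch.optim.Adam([a], lr=0.05)

for step in range(500):
    opt.zero_grad()
    x_pred = integrate(a, x0, t)
    loss = torch.mean((x_pred - x_data) ** 2)
    loss.backward()                     # backprop through the unrolled solver
    opt.step()

print(f"learned a = {a.item():.3f} (true a = {a_true})")
```

Even in this one-dimensional case, the loss landscape depends exponentially on a through the solution map, which is one way the training difficulties discussed in the paper can show up; the paper's own stabilization method and analysis go beyond this sketch.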
Similar Papers
Symbolic Neural Ordinary Differential Equations
Machine Learning (CS)
Teaches computers to understand how things change.
Learning Optical Flow Field via Neural Ordinary Differential Equation
CV and Pattern Recognition
Makes computer vision better by learning smarter.
Neural ODE Transformers: Analyzing Internal Dynamics and Adaptive Fine-tuning
Machine Learning (CS)
Makes AI understand itself better.