Quantitative Flow Approximation Properties of Narrow Neural ODEs
By: Karthik Elamvazhuthi
Potential Business Impact:
Shows when a small, narrow neural ODE can imitate the behavior of a much wider one, which may help build smaller models that act the same way.
In this note, we revisit the flow approximation properties of neural ordinary differential equations (NODEs). In recent literature, these approximation properties have been studied as a flow controllability problem. A neural ODE is considered "narrow" when the width of the network equals the dimension of its input, so the network has limited width. We relate narrow NODEs to the approximation of flows of shallow but wide NODEs. Combined with existing results on the approximation properties of shallow neural networks, this helps characterize which flows of dynamical systems can be approximated using narrow neural ODEs. While approximation properties of narrow NODEs have been established in the literature, the proofs often involve extensive constructions or invoke deep controllability theorems from control theory. In this paper, we provide a simpler proof technique that uses only elementary ODE arguments and Grönwall's lemma. Moreover, we provide an estimate on the number of switches needed for the time-dependent weights of the narrow NODE to mimic the behavior of a NODE whose velocity field is a single-layer wide neural network.
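To make the setting concrete, here is a minimal sketch of the two model classes involved; the parameterizations below are illustrative assumptions consistent with the abstract, not the paper's exact definitions. The wide, shallow NODE has a single hidden layer whose width p may be large, while the narrow NODE has width equal to the state dimension d and time-dependent (piecewise-constant, i.e., switched) weights:

\dot{x}(t) = \sum_{i=1}^{p} a_i \, \sigma\big(w_i^{\top} x(t) + b_i\big), \qquad x(t) \in \mathbb{R}^{d} \quad \text{(wide, shallow NODE)},

\dot{x}(t) = A(t)\, \sigma\big(W(t)\, x(t) + c(t)\big), \qquad A(t),\, W(t) \in \mathbb{R}^{d \times d} \quad \text{(narrow NODE, switched weights)}.

Under this reading, the paper's estimate counts how many switches of A(t), W(t), c(t) are needed for the narrow flow to track the wide one, with the error controlled via a Grönwall-type argument.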
Similar Papers
Approximation properties of neural ODEs
Numerical Analysis
Studies which functions and flows neural ODEs can approximate.
Universal approximation property of neural stochastic differential equations
Probability
Shows that neural stochastic differential equations have a universal approximation property.
Learning Optical Flow Field via Neural Ordinary Differential Equation
CV and Pattern Recognition
Estimates optical flow in images using a neural ordinary differential equation model.