Approximation properties of neural ODEs
By: Arturo De Marinis, Davide Murari, Elena Celledoni and more
Potential Business Impact:
Clarifies when neural-ODE-based networks can approximate any continuous function, including under stability constraints, which informs the design of more reliable machine learning models.
We study the approximation properties of shallow neural networks whose activation function is defined as the flow of a neural ordinary differential equation (neural ODE) evaluated at the final time of the integration interval. We prove the universal approximation property (UAP) of such shallow neural networks in the space of continuous functions. Furthermore, we investigate the approximation properties of shallow neural networks whose parameters are required to satisfy certain constraints. In particular, we constrain the Lipschitz constant of the flow of the neural ODE to increase the stability of the shallow neural network, and we restrict the norm of the weight matrices of the linear layers to one, ensuring that the restricted expansivity of the flow is not compensated by increased expansivity of the linear layers. For this setting, we prove approximation bounds quantifying the accuracy to which a continuous function can be approximated by a shallow neural network under these constraints. We prove that the UAP holds if we impose only the constraint on the Lipschitz constant of the flow, or only the unit-norm constraint on the weight matrices of the linear layers.
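The architecture studied in the abstract can be illustrated concretely. Below is a minimal NumPy sketch of a shallow network whose "activation" is the flow of a neural ODE, with unit-norm linear layers as in the constrained setting. The tanh vector field, the explicit Euler integrator, and all names and dimensions are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def neural_ode_flow(y, A, c, T=1.0, steps=50):
    """Approximate the time-T flow of the neural ODE
    y'(t) = tanh(A y(t) + c) with explicit Euler steps.
    (tanh and Euler are illustrative choices.)"""
    h = T / steps
    for _ in range(steps):
        y = y + h * np.tanh(y @ A.T + c)
    return y

def shallow_net(x, W1, b1, A, c, W2, b2):
    """Shallow network with the neural ODE flow as activation:
    x -> W2 @ Phi_T(W1 x + b1) + b2."""
    z = x @ W1.T + b1             # inner linear layer (lift to hidden width)
    z = neural_ode_flow(z, A, c)  # flow of the neural ODE acts as the activation
    return z @ W2.T + b2          # outer linear layer

# Illustrative dimensions: input 2, hidden width 8, output 1.
d_in, d_h, d_out = 2, 8, 1
W1 = rng.standard_normal((d_h, d_in))
W1 /= np.linalg.norm(W1, 2)      # unit spectral norm: the weight constraint from the abstract
A = 0.1 * rng.standard_normal((d_h, d_h))  # small A keeps the flow's Lipschitz constant moderate
c = np.zeros(d_h)
b1 = np.zeros(d_h)
W2 = rng.standard_normal((d_out, d_h))
W2 /= np.linalg.norm(W2, 2)      # same unit-norm constraint on the outer layer
b2 = np.zeros(d_out)

x = rng.standard_normal((5, d_in))  # a batch of 5 inputs
y = shallow_net(x, W1, b1, A, c, W2, b2)
print(y.shape)  # (5, 1)
```

Normalizing `W1` and `W2` mirrors the paper's point that expansivity removed from the flow should not reappear in the linear layers; the small scale of `A` plays the role of the Lipschitz constraint on the flow.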
Similar Papers
Quantitative Flow Approximation Properties of Narrow Neural ODEs
Optimization and Control
Derives quantitative bounds on how well narrow neural ODEs approximate flow maps.
Universal approximation property of neural stochastic differential equations
Probability
Proves that neural stochastic differential equations possess the universal approximation property.
The Influence of the Memory Capacity of Neural DDEs on the Universal Approximation Property
Dynamical Systems
Examines how the memory capacity of neural delay differential equations affects their universal approximation property.