Consistency of Learned Sparse Grid Quadrature Rules using NeuralODEs
By: Hanno Gottschalk, Emil Partow, Tobias J. Riedlinger
Potential Business Impact:
Makes high-dimensional integration problems tractable with provable accuracy guarantees.
This paper provides a proof of the consistency of sparse grid quadrature for numerical integration of high-dimensional distributions. In a first step, a transport map is learned that normalizes the distribution to a noise distribution on the unit cube. This step builds on the recently established statistical learning theory of neural ordinary differential equations. Secondly, the composition of the generative map with the quantity of interest is integrated numerically using Clenshaw-Curtis sparse grid quadrature. A decomposition of the total numerical error into a quadrature error and a statistical error is provided. As the main result, it is proven in the framework of empirical risk minimization that all error terms can be controlled in the sense of PAC (probably approximately correct) learning, and that with high probability the numerical integral approximates the theoretical value up to an arbitrarily small error in the limit where the data set size grows and the network capacity is increased adaptively.
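To make the quadrature step concrete, the following is a minimal, self-contained sketch of a Smolyak sparse grid built from nested Clenshaw-Curtis rules on the unit cube. It is an illustration of the general technique named in the abstract, not the paper's actual construction: the function `g` plays the role of the composition of the learned transport map with the quantity of interest, and the level parameter `q`, the level-0 midpoint rule, and all function names are assumptions made for this sketch.

```python
import math
from itertools import product

def cc_rule_01(level):
    """Clenshaw-Curtis nodes and weights on [0, 1].

    Level 0 is the single-point midpoint rule; level l >= 1 uses
    2**l + 1 nodes (the nested Chebyshev extrema, mapped to [0, 1])."""
    if level == 0:
        return [0.5], [1.0]
    n = 2 ** level
    nodes, weights = [], []
    for k in range(n + 1):
        x = math.cos(math.pi * k / n)          # Chebyshev node on [-1, 1]
        c = 1.0 if k in (0, n) else 2.0
        s = 0.0
        for j in range(1, n // 2 + 1):
            b = 1.0 if j == n // 2 else 2.0
            s += (b / (4 * j * j - 1)) * math.cos(2.0 * math.pi * j * k / n)
        w = (c / n) * (1.0 - s)                # weight on [-1, 1]
        nodes.append(0.5 * (1.0 + x))          # affine map to [0, 1]
        weights.append(0.5 * w)
    return nodes, weights

def smolyak_integrate(g, d, q):
    """Approximate the integral of g over [0, 1]^d with the Smolyak
    combination formula: sum over level vectors l (components >= 0) with
    q - d + 1 <= |l| <= q, weighted by (-1)**(q - |l|) * C(d - 1, q - |l|)."""
    total = 0.0
    for l in product(range(q + 1), repeat=d):
        s = sum(l)
        if not (q - d + 1 <= s <= q):
            continue
        coeff = (-1) ** (q - s) * math.comb(d - 1, q - s)
        rules = [cc_rule_01(li) for li in l]   # one 1D rule per dimension
        for idx in product(*(range(len(r[0])) for r in rules)):
            u = [rules[i][0][idx[i]] for i in range(d)]
            w = math.prod(rules[i][1][idx[i]] for i in range(d))
            total += coeff * w * g(u)
    return total
```

For example, `smolyak_integrate(lambda u: math.exp(u[0] + u[1]), 2, 5)` approximates the integral of exp(u1 + u2) over the unit square, whose exact value is (e - 1)^2; in the paper's setting the integrand would instead be the quantity of interest composed with the learned NeuralODE transport map.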
Similar Papers
Stochastic Quadrature Rules for Solving PDEs using Neural Networks
Numerical Analysis
Uses neural networks to build quadrature rules for solving PDEs.
Distribution learning via neural differential equations: minimal energy regularization and approximation theory
Machine Learning (CS)
Analyzes how neural differential equations can learn probability distributions.
Numerical and statistical analysis of NeuralODE with Runge-Kutta time integration
Machine Learning (CS)
Provides numerical and statistical guarantees for NeuralODEs trained with Runge-Kutta integration.