How deep is your network? Deep vs. shallow learning of transfer operators
By: Mohammad Tabish, Benedict Leimkuhler, Stefan Klus
Potential Business Impact:
Teaches computers complex patterns faster, with fewer errors.
We propose a randomized neural network approach called RaNNDy for learning transfer operators and their spectral decompositions from data. The weights of the hidden layers of the neural network are randomly selected and only the output layer is trained. The main advantage is that, without a noticeable reduction in accuracy, this approach significantly reduces training time and resources while avoiding common problems associated with deep learning, such as sensitivity to hyperparameters and slow convergence. Additionally, the proposed framework allows us to compute a closed-form solution for the output layer, which directly represents the eigenfunctions of the operator. Moreover, it is possible to estimate uncertainties associated with the computed spectral properties via ensemble learning. We present results for different dynamical operators, including the Koopman and Perron-Frobenius operators, which have important applications in analyzing the behavior of complex dynamical systems, as well as the Schrödinger operator. The numerical examples, which highlight the strengths but also the weaknesses of the proposed framework, include several stochastic dynamical systems, protein folding processes, and the quantum harmonic oscillator.
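To make the idea concrete, here is a minimal sketch of the general scheme the abstract describes, assuming a random-feature formulation in the spirit of extended dynamic mode decomposition: hidden-layer weights are drawn once and frozen, the output layer is obtained in closed form by least squares, and the eigendecomposition of the resulting matrix yields approximate Koopman eigenvalues and eigenfunctions. The variable names, the tanh feature map, and the toy Ornstein-Uhlenbeck data are illustrative assumptions, not the authors' RaNNDy implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(x, W, b):
    """Fixed random hidden layer: phi(x) = tanh(x W + b)."""
    return np.tanh(x @ W + b)

# Toy data: snapshot pairs (x_t, x_{t+tau}) from a 1D
# Ornstein-Uhlenbeck process, whose Koopman eigenvalues at lag
# tau are known to be exp(-k * alpha * tau), k = 0, 1, 2, ...
n, tau, alpha, sigma = 5000, 0.1, 1.0, 0.5
x = rng.normal(0.0, sigma / np.sqrt(2 * alpha), size=(n, 1))
y = x * np.exp(-alpha * tau) + rng.normal(
    0.0, sigma * np.sqrt((1 - np.exp(-2 * alpha * tau)) / (2 * alpha)),
    size=(n, 1))

# Randomly drawn, then frozen, hidden-layer weights (not trained).
d_hidden = 200
W = rng.normal(size=(1, d_hidden))
b = rng.uniform(-1.0, 1.0, size=d_hidden)

Phi_x = random_features(x, W, b)
Phi_y = random_features(y, W, b)

# Closed-form output layer: least-squares Koopman matrix
# K = argmin ||Phi_x K - Phi_y||, i.e. K = (Phi_x^T Phi_x)^+ Phi_x^T Phi_y.
K = np.linalg.lstsq(Phi_x, Phi_y, rcond=None)[0]

# Spectral decomposition: eigenvalues should approximate
# exp(-k * alpha * tau); right eigenvectors give the expansion
# coefficients of the eigenfunctions in the random feature basis.
evals, evecs = np.linalg.eig(K)
order = np.argsort(-np.abs(evals))
print("leading Koopman eigenvalues:", np.round(evals[order][:4].real, 3))

# Eigenfunctions evaluated on the data: psi_k(x) = phi(x) @ v_k.
psi = Phi_x @ evecs[:, order[:4]]
```

Since only the random draw of W and b varies between runs, repeating the fit with independently sampled hidden weights yields an ensemble of spectral estimates, which is one plausible way to obtain the uncertainty estimates the abstract mentions.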
Similar Papers
DeepONet Augmented by Randomized Neural Networks for Efficient Operator Learning in PDEs
Machine Learning (CS)
Solves hard math problems much faster.
Cauchy Random Features for Operator Learning in Sobolev Space
Machine Learning (CS)
Teaches computers to learn math faster.
The Spectral Bias of Shallow Neural Network Learning is Shaped by the Choice of Non-linearity
Machine Learning (CS)
Helps smart computers learn better without forgetting.