Discontinuous hybrid neural networks for the one-dimensional partial differential equations
By: Xiaoyu Wang, Long Yuan, Yao Yu
Potential Business Impact:
Solves hard math problems with smart computer programs.
A feedforward neural network with hidden layers activated by nonlinear functions (such as Tanh, ReLU, and Sigmoid) exhibits uniform approximation properties in Sobolev spaces, and discontinuous neural networks can reduce computational complexity. In this work, we present a discontinuous hybrid neural network method for solving one-dimensional partial differential equations and construct a new hybrid loss functional that incorporates the variational form of the approximate equation, interface jump terms, and boundary constraints. The RMSprop algorithm and the discontinuous Galerkin method are employed to update the nonlinear and linear parameters of the networks, respectively. This approach guarantees convergence of the loss functional and yields approximate solutions of high accuracy.
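The abstract's hybrid split, linear output weights solved directly while nonlinear hidden parameters are updated by RMSprop, can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's method: it uses strong-form collocation with a least-squares solve as a stand-in for the discontinuous Galerkin step, two tanh subnetworks on (0, 0.5) and (0.5, 1) for the model problem -u'' = π² sin(πx) with u(0) = u(1) = 0, penalty rows for the interface jumps [u] = [u'] = 0 and the boundary conditions, and a hand-rolled RMSprop with finite-difference gradients. All names and constants here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model problem: -u'' = f on (0,1), u(0)=u(1)=0, exact u = sin(pi x).
f = lambda x: np.pi**2 * np.sin(np.pi * x)

K, n = 2, 8                          # two subdomains, 8 neurons each (toy sizes)
brk = [0.0, 0.5, 1.0]                # subdomain breakpoints, interface at 0.5
W = rng.uniform(1.0, 3.0, (K, n))    # nonlinear hidden weights
B = rng.uniform(-2.0, 2.0, (K, n))   # nonlinear hidden biases
xs = [np.linspace(brk[k], brk[k + 1], 20) for k in range(K)]

def phi(x, w, b):
    """tanh features and their first/second x-derivatives."""
    t = np.tanh(np.outer(x, w) + b)
    return t, w * (1 - t**2), -2 * w**2 * t * (1 - t**2)

def assemble(W, B):
    """Stack collocation residual, interface-jump, and boundary rows."""
    rows, rhs = [], []
    for k in range(K):               # -u'' = f at interior points
        _, _, d2 = phi(xs[k], W[k], B[k])
        blk = np.zeros((len(xs[k]), K * n))
        blk[:, k*n:(k+1)*n] = -d2
        rows.append(blk); rhs.append(f(xs[k]))
    for deriv in (0, 1):             # jumps [u]=0, [u']=0 at x=0.5
        row = np.zeros(K * n)
        for k, sgn in ((0, 1.0), (1, -1.0)):
            t, d1, _ = phi(np.array([0.5]), W[k], B[k])
            row[k*n:(k+1)*n] = sgn * (t if deriv == 0 else d1)[0]
        rows.append(10.0 * row[None]); rhs.append(np.zeros(1))
    for k, xb in ((0, 0.0), (1, 1.0)):   # boundary constraints
        t, _, _ = phi(np.array([xb]), W[k], B[k])
        row = np.zeros(K * n); row[k*n:(k+1)*n] = t[0]
        rows.append(10.0 * row[None]); rhs.append(np.zeros(1))
    return np.vstack(rows), np.concatenate(rhs)

def loss_and_c(W, B):
    """Linear parameters by least squares; return hybrid loss and c."""
    A, r = assemble(W, B)
    c, *_ = np.linalg.lstsq(A, r, rcond=None)
    return np.sum((A @ c - r)**2), c

# RMSprop on nonlinear parameters, finite-difference gradients.
theta = np.concatenate([W.ravel(), B.ravel()])
unpack = lambda th: (th[:K*n].reshape(K, n), th[K*n:].reshape(K, n))
v, lr, eps, h = np.zeros_like(theta), 0.05, 1e-8, 1e-4
init_loss, _ = loss_and_c(*unpack(theta))
best_loss, best_th = init_loss, theta.copy()
for _ in range(60):
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += h; tm[i] -= h
        g[i] = (loss_and_c(*unpack(tp))[0] - loss_and_c(*unpack(tm))[0]) / (2 * h)
    v = 0.9 * v + 0.1 * g**2
    theta -= lr * g / (np.sqrt(v) + eps)
    L, _ = loss_and_c(*unpack(theta))
    if L < best_loss:
        best_loss, best_th = L, theta.copy()

W, B = unpack(best_th)
loss, c = loss_and_c(W, B)
err = max(np.max(np.abs(phi(xs[k], W[k], B[k])[0] @ c[k*n:(k+1)*n]
                        - np.sin(np.pi * xs[k]))) for k in range(K))
print(f"loss {loss:.3e}  max error {err:.3e}")
```

The alternation shown here (direct solve for the linear layer, gradient steps for the hidden layer) is the structural point of the hybrid method; the least-squares system is only a crude surrogate for the variational DG formulation described in the abstract.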
Similar Papers
Neural Network Element Method for Partial Differential Equations
Numerical Analysis
Solves hard math problems for engineers.
A Hybrid Discontinuous Galerkin Neural Network Method for Solving Hyperbolic Conservation Laws with Temporal Progressive Learning
Numerical Analysis
Helps computers solve tricky math problems better.
Approximation properties of neural ODEs
Numerical Analysis
Makes smart computer programs learn better.