Functional tensor train neural network for solving high-dimensional PDEs
By: Yani Feng, Michael K. Ng, Kejun Tang, and more
Potential Business Impact:
Solves hard, high-dimensional math problems on irregularly shaped domains.
The discrete tensor train decomposition is widely employed to mitigate the curse of dimensionality when solving high-dimensional PDEs with traditional methods. However, the direct application of the tensor train method typically requires uniform grids on regular domains, which limits its use on non-uniform grids or irregular domains. To address this limitation, we develop a functional tensor train neural network (FTTNN) for solving high-dimensional PDEs that can represent PDE solutions on non-uniform grids or irregular domains. An essential ingredient of our approach is to represent PDE solutions in the functional tensor train format, whose TT-core functions are approximated by neural networks. To obtain the functional tensor train representation, we propose and study the functional tensor train rank and incorporate it into a physics-informed loss function for training. Because of the tensor train representation, the resulting high-dimensional integral in the loss function can be computed via one-dimensional integrals using Gauss quadrature rules. Numerical examples, including high-dimensional PDEs on regular and irregular domains, demonstrate that the proposed FTTNN outperforms physics-informed neural networks (PINNs).
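A minimal sketch of the two ideas in the abstract, under stated assumptions: a function u(x1, ..., xd) is represented in functional tensor train form as a chain of matrix-valued core functions G1(x1) G2(x2) ... Gd(xd), and a d-dimensional integral of such a function factorizes into one-dimensional Gauss quadratures of each core. The paper approximates each TT-core function with a trained neural network; here the cores are untrained random-feature networks, and the names (`make_core`, `rank`, the box domain [-1, 1]^d) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

d = 4       # spatial dimension (illustrative)
rank = 3    # assumed uniform FTT rank

rng = np.random.default_rng(0)

def make_core(r_in, r_out, width=8):
    """One TT-core function x -> (r_in x r_out) matrix, here a tiny
    fixed (untrained) single-hidden-layer network standing in for the
    trained neural networks used in the paper."""
    W1 = rng.standard_normal((width, 1))
    b1 = rng.standard_normal((width, 1))
    W2 = rng.standard_normal((r_in * r_out, width)) / width
    def core(x):
        h = np.tanh(W1 * x + b1)            # (width, 1)
        return (W2 @ h).reshape(r_in, r_out)
    return core

ranks = [1] + [rank] * (d - 1) + [1]        # boundary ranks are 1
cores = [make_core(ranks[k], ranks[k + 1]) for k in range(d)]

def u(x):
    """Evaluate the FTT surrogate at a point x in R^d via a chain of
    small matrix products -- cost grows linearly in d."""
    G = cores[0](x[0])
    for k in range(1, d):
        G = G @ cores[k](x[k])
    return G.item()                          # 1x1 matrix -> scalar

# Separability: the d-dimensional integral of u over [-1, 1]^d reduces
# to d one-dimensional Gauss-Legendre quadratures, one per core.
nodes, weights = np.polynomial.legendre.leggauss(16)
core_ints = [sum(w * cores[k](t) for t, w in zip(nodes, weights))
             for k in range(d)]
I_separable = np.linalg.multi_dot(core_ints).item()

# Cross-check against brute-force tensor-product quadrature in d dims,
# which touches 16**d points instead of d * 16.
I_full = 0.0
for idx in np.ndindex(*(len(nodes),) * d):
    x = nodes[list(idx)]
    w = np.prod(weights[list(idx)])
    I_full += w * u(x)

print(abs(I_separable - I_full))  # agreement up to floating-point error
```

The cross-check works because the tensor-product quadrature sum of a separable function factors exactly into the product of the per-dimension sums; the separable evaluation is what keeps the physics-informed loss tractable in high dimensions.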
Similar Papers
The low-rank tensor-train finite difference method for three-dimensional parabolic equations
Numerical Analysis
Makes complex computer problems run faster and use less memory.
High order Tensor-Train-Based Schemes for High-Dimensional Mean Field Games
Numerical Analysis
Solves hard math problems for many players faster.
Interpolating Neural Network-Tensor Decomposition (INN-TD): a scalable and interpretable approach for large-scale physics-based problems
Computational Engineering, Finance, and Science
Makes computer models of science problems faster and smaller.