Functional tensor train neural network for solving high-dimensional PDEs

Published: October 15, 2025 | arXiv ID: 2510.13386v1

By: Yani Feng, Michael K. Ng, Kejun Tang, and more

Potential Business Impact:

Enables efficient numerical solution of high-dimensional PDEs on irregular domains and non-uniform grids.

Business Areas:
Darknet Internet Services

Discrete tensor train decomposition is widely used to mitigate the curse of dimensionality when solving high-dimensional PDEs with traditional methods. However, directly applying the tensor train method typically requires uniform grids on regular domains, which limits its use on non-uniform grids or irregular domains. To address this limitation, we develop a functional tensor train neural network (FTTNN) for solving high-dimensional PDEs, which can represent PDE solutions on non-uniform grids or irregular domains. An essential ingredient of our approach is to represent the PDE solution in the functional tensor train format, whose TT-core functions are approximated by neural networks. To obtain this functional tensor train representation, we propose and study a functional tensor train rank and incorporate it into a physics-informed loss function for training. Owing to the tensor train representation, the resulting high-dimensional integral in the loss function can be computed via one-dimensional integrals using Gauss quadrature rules. Numerical examples, including high-dimensional PDEs on regular and irregular domains, demonstrate that the proposed FTTNN outperforms physics-informed neural networks (PINNs).
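The abstract's two main ingredients can be illustrated with a short, hypothetical sketch (not the authors' implementation): each TT-core function is a small neural network mapping one coordinate to a matrix, the solution is the product of these matrices, and a high-dimensional integral over a tensor-product box factors into one-dimensional Gauss quadratures. The framework choice (PyTorch), the names FTTNN, CoreNet, and integrate_ftt, the ranks, and the network widths are all assumptions for illustration; the physics-informed loss and the treatment of irregular domains are omitted.

```python
# Minimal sketch (not the authors' code) of a functional tensor train (FTT)
# representation u(x_1,...,x_d) = G_1(x_1) G_2(x_2) ... G_d(x_d), where each
# TT-core function G_k maps one coordinate to an r_{k-1} x r_k matrix and is
# approximated by a small neural network. All names here are illustrative.
import numpy as np
import torch
import torch.nn as nn


class CoreNet(nn.Module):
    """One TT-core function G_k: R -> R^{r_in x r_out}."""

    def __init__(self, r_in, r_out, width=32):
        super().__init__()
        self.r_in, self.r_out = r_in, r_out
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, r_in * r_out),
        )

    def forward(self, x):                       # x: (batch, 1)
        return self.net(x).view(-1, self.r_in, self.r_out)


class FTTNN(nn.Module):
    """u(x) as a product of matrix-valued core functions, with r_0 = r_d = 1."""

    def __init__(self, dim, rank):
        super().__init__()
        ranks = [1] + [rank] * (dim - 1) + [1]
        self.cores = nn.ModuleList(
            CoreNet(ranks[k], ranks[k + 1]) for k in range(dim)
        )

    def forward(self, x):                       # x: (batch, dim)
        out = self.cores[0](x[:, 0:1])          # (batch, 1, r_1)
        for k in range(1, len(self.cores)):
            out = torch.bmm(out, self.cores[k](x[:, k:k + 1]))
        return out[:, 0, 0]                     # (batch,)


def integrate_ftt(model, a=0.0, b=1.0, n_quad=16):
    """Approximate the d-dimensional integral of u over [a, b]^d using only
    one-dimensional Gauss-Legendre rules: the TT structure factors the
    integral into a matrix product of integrated cores."""
    nodes, weights = np.polynomial.legendre.leggauss(n_quad)
    x = torch.tensor(0.5 * (b - a) * nodes + 0.5 * (a + b),
                     dtype=torch.float32).view(-1, 1)
    w = torch.tensor(0.5 * (b - a) * weights, dtype=torch.float32)
    result = None
    with torch.no_grad():
        for core in model.cores:
            Ik = torch.einsum('q,qij->ij', w, core(x))  # 1D integral of G_k
            result = Ik if result is None else result @ Ik
    return result.item()                        # 1 x 1 matrix -> scalar


# Usage: evaluate a 5-dimensional FTT surrogate and integrate it over [0, 1]^5.
model = FTTNN(dim=5, rank=3)
u_vals = model(torch.rand(16, 5))               # shape (16,)
print(integrate_ftt(model))
```

The separability exploited in integrate_ftt is the point the abstract makes about quadrature: because each core depends on a single coordinate, the integral of the matrix product equals the matrix product of the one-dimensional integrals, so the cost grows linearly rather than exponentially in the dimension.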

Country of Origin
🇭🇰 Hong Kong

Page Count
20 pages

Category
Mathematics:
Numerical Analysis (Math)