General transformation neural networks: A class of parametrized functions for high-dimensional function approximation

Published: October 23, 2025 | arXiv ID: 2510.20142v1

By: Xiaoyang Wang, Yiqi Gu

Potential Business Impact:

Improves the accuracy of neural networks on hard, high-dimensional mathematical approximation problems.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We propose a novel class of neural-network-like parametrized functions, called general transformation neural networks (GTNNs), for high-dimensional approximation. Conventional deep neural networks can lose accuracy in approximation problems under gradient-descent training, especially when the target function is oscillatory. To improve accuracy, we generalize the affine transformation of the abstract neuron to more general functions, which act as more complex shape functions with larger capacity. Specifically, we introduce two types of GTNNs: cubic and quadratic transformation neural networks (CTNNs and QTNNs). We analyze the approximation error of CTNNs and QTNNs, establishing their universal approximation properties for continuous functions and error bounds for smooth functions and Barron-type functions. Several numerical examples of regression problems and partial differential equations demonstrate that CTNNs/QTNNs outperform conventional fully connected neural networks in accuracy and robustness.
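To make the core idea concrete, here is a minimal NumPy sketch contrasting a conventional neuron, which applies an activation to an affine map w·x + b, with a quadratic-transformation neuron that replaces the affine map with a quadratic form xᵀAx + w·x + b. The exact parametrization of QTNN neurons in the paper may differ; the quadratic form below is an illustrative assumption, not the authors' precise definition.

```python
import numpy as np

def affine_neuron(x, w, b, act=np.tanh):
    # Conventional abstract neuron: activation applied to the affine map w.x + b.
    return act(w @ x + b)

def quadratic_neuron(x, A, w, b, act=np.tanh):
    # QTNN-style neuron (assumed form): the affine map is generalized to a
    # quadratic transformation x^T A x + w.x + b, giving the "shape function"
    # inside the activation a larger capacity.
    return act(x @ A @ x + w @ x + b)

# Example with a 3-dimensional input.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
w = rng.standard_normal(3)
A = rng.standard_normal((3, 3))
b = 0.1

y_affine = affine_neuron(x, w, b)
y_quad = quadratic_neuron(x, A, w, b)
print(y_affine, y_quad)
```

A cubic-transformation neuron (CTNN-style) would extend this pattern with a degree-3 term in x; in both cases the extra polynomial terms let a single neuron represent more oscillatory shapes than an affine map can.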

Country of Origin
🇨🇳 China

Page Count
18 pages

Category
Mathematics: Numerical Analysis