General transformation neural networks: A class of parametrized functions for high-dimensional function approximation
By: Xiaoyang Wang, Yiqi Gu
Potential Business Impact:
Makes computers better at approximating tricky, high-dimensional math functions.
We propose a novel class of neural-network-like parametrized functions, termed general transformation neural networks (GTNNs), for high-dimensional approximation. Conventional deep neural networks sometimes achieve lower accuracy in approximation problems under gradient descent training, especially when the target function is oscillatory. To improve accuracy, we generalize the affine transformation of the abstract neuron to more general functions, which act as more complex shape functions and have larger capacity. Specifically, we introduce two types of GTNNs: cubic and quadratic transformation neural networks (CTNNs and QTNNs). We perform approximation error analysis for CTNNs and QTNNs, presenting their universal approximation properties for continuous functions and error bounds for smooth functions and Barron-type functions. Several numerical examples of regression problems and partial differential equations are presented, demonstrating that CTNNs/QTNNs offer advantages in accuracy and robustness over conventional fully connected neural networks.
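The abstract does not give the exact parametrization of the generalized neuron, so the snippet below is only a minimal sketch of one plausible reading: a "quadratic transformation" layer in which each unit applies an activation to a full quadratic polynomial in the input, rather than to the usual affine map w·x + b. The class name QuadraticTransformLayer, the parameter shapes, and the choice of tanh activation are all illustrative assumptions, not the authors' construction; a cubic (CTNN-style) layer would extend the polynomial analogously.

```python
# Hypothetical sketch (not the authors' code): each output unit i computes
# sigma(x^T A_i x + w_i . x + b_i), i.e. an activation applied to a quadratic
# polynomial in x instead of the conventional affine transformation.
import torch
import torch.nn as nn


class QuadraticTransformLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        # Quadratic term: one matrix per output unit.
        self.A = nn.Parameter(0.01 * torch.randn(out_dim, in_dim, in_dim))
        # Affine part, as in a conventional fully connected layer.
        self.w = nn.Parameter(0.01 * torch.randn(out_dim, in_dim))
        self.b = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_dim); quadratic form x^T A_i x for each unit i.
        quad = torch.einsum("bi,oij,bj->bo", x, self.A, x)
        affine = x @ self.w.T + self.b
        return torch.tanh(quad + affine)


# Usage: a small regression network stacking quadratic-transform layers.
model = nn.Sequential(
    QuadraticTransformLayer(2, 32),
    QuadraticTransformLayer(32, 32),
    nn.Linear(32, 1),
)
y = model(torch.randn(16, 2))  # output shape: (16, 1)
```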
Similar Papers
General Transform: A Unified Framework for Adaptive Transform to Enhance Representations
Machine Learning (CS)
Computer models learn better by adapting how they see data.
Genetic Transformer-Assisted Quantum Neural Networks for Optimal Circuit Design
Quantum Physics
Makes quantum computers learn better with less effort.
A framework of discontinuous Galerkin neural networks for iteratively approximating residuals
Numerical Analysis
Makes computer models solve math problems faster.