Approximation Capabilities of Feedforward Neural Networks with GELU Activations

Published: December 25, 2025 | arXiv ID: 2512.21749v1

By: Konstantin Yakovlev, Nikita Puchkin

We derive an approximation error bound that holds simultaneously for a function and all its derivatives up to any prescribed order. The bounds apply to elementary functions, including multivariate polynomials, the exponential function, and the reciprocal function, and are obtained using feedforward neural networks with the Gaussian Error Linear Unit (GELU) activation. In addition, we quantify the network size, the weight magnitudes, and the approximators' behavior at infinity. Our analysis begins with a constructive approximation of multiplication, where we prove that the error bounds hold simultaneously over domains of increasing size for a single approximator. Leveraging this result, we obtain approximation guarantees for division and the exponential function, ensuring that all higher-order derivatives of the resulting approximators remain globally bounded.
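To illustrate the kind of constructive multiplication approximation the abstract refers to, below is a minimal numerical sketch, not necessarily the paper's construction: two GELU neurons approximate the square function via a second-order Taylor expansion of GELU around zero, and the polarization identity x*y = ((x+y)^2 - (x-y)^2)/4 then yields an approximate product. The step size h and the test domain [-2, 2]^2 are illustrative choices, not values taken from the paper.

```python
import numpy as np
from scipy.special import erf

# GELU''(0) = 2 * phi(0) = sqrt(2 / pi), with phi the standard normal density.
GELU_SECOND_DERIV_AT_0 = np.sqrt(2.0 / np.pi)


def gelu(x):
    """Exact GELU activation: x * Phi(x), with Phi the standard normal CDF."""
    return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))


def approx_square(t, h=1e-2):
    """Approximate t**2 with two GELU neurons.

    Taylor expansion around 0 (odd terms cancel):
        gelu(h*t) + gelu(-h*t) = GELU''(0) * (h*t)**2 + O(h**4 * t**4),
    so dividing by GELU''(0) * h**2 recovers t**2 up to O(h**2).
    """
    return (gelu(h * t) + gelu(-h * t)) / (GELU_SECOND_DERIV_AT_0 * h ** 2)


def approx_mul(x, y, h=1e-2):
    """Approximate x*y via the polarization identity using approx_square."""
    return (approx_square(x + y, h) - approx_square(x - y, h)) / 4.0


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-2.0, 2.0, size=100)
    y = rng.uniform(-2.0, 2.0, size=100)
    err = np.max(np.abs(approx_mul(x, y) - x * y))
    print(f"max |approx_mul - x*y| on [-2, 2]^2: {err:.2e}")
```

Because GELU is smooth with all derivatives bounded, a construction of this kind also tracks the derivatives of the target (here, of t**2), which is the property that simultaneous bounds for a function and its derivatives, as stated in the abstract, rely on.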

Category: Computer Science / Machine Learning (CS)