Approximation Rates of Shallow Neural Networks: Barron Spaces, Activation Functions and Optimality Analysis

Published: October 21, 2025 | arXiv ID: 2510.18388v1

By: Jian Lu, Xiaohuang Huang

Potential Business Impact:

Clarifies how accurately small (shallow) neural networks can approximate complex functions, guiding the choice of activation function and network size when designing efficient AI models.

Business Areas:
A/B Testing; Data and Analytics

This paper investigates the approximation properties of shallow neural networks whose activation functions are powers of exponential functions, focusing on how the approximation rate depends on the input dimension and on the smoothness of the target function within the Barron space. For networks with ReLU$^{k}$ activations, the authors prove that the optimal rate cannot be achieved under $\ell^{1}$-bounded outer coefficients or insufficient smoothness conditions. They also establish optimal approximation rates in various norms for functions in Barron and Sobolev spaces, confirming the curse of dimensionality. These results clarify the limits of shallow networks' approximation power and offer guidance on the selection of activation functions and network structures.
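For context, the object studied here is a shallow network with $n$ neurons,

$$ f_{n}(x) \;=\; \sum_{j=1}^{n} a_{j}\,\sigma(w_{j}\cdot x + b_{j}), \qquad \sigma(t) = \max(t,0)^{k} \;\; (\mathrm{ReLU}^{k}), $$

and the classical Barron-type bound (a standard background result, not a theorem specific to this paper) guarantees, for $f$ in the Barron space $\mathcal{B}$, a dimension-independent $L^{2}$ rate

$$ \inf_{a_{j},\,w_{j},\,b_{j}} \big\| f - f_{n} \big\|_{L^{2}} \;\le\; C\,\frac{\|f\|_{\mathcal{B}}}{\sqrt{n}}. $$

The paper's question is when rates sharper than $n^{-1/2}$, which depend on the smoothness index $k$ and the dimension $d$, can or cannot be attained, for example under an $\ell^{1}$ bound on the outer coefficients $a_{j}$.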

Country of Origin
🇨🇳 China

Page Count
29 pages

Category
Computer Science:
Machine Learning (CS)