New Approximation Results and Optimal Estimation for Fully Connected Deep Neural Networks

Published: December 10, 2025 | arXiv ID: 2512.09853v1

By: Zhaoji Tang

Potential Business Impact:

Provides sharper theoretical guarantees for deep neural network estimators, showing they can learn from data at the best achievable rate.

Business Areas:
Darknet Internet Services

\citet{farrell2021deep} establish non-asymptotic high-probability bounds for general deep feedforward neural network estimators (with rectified linear unit activation), with \citet[Theorem 1]{farrell2021deep} achieving a suboptimal convergence rate for fully connected feedforward networks. The authors suggest that an improved approximation theory for fully connected networks could yield sharper versions of \citet[Theorem 1]{farrell2021deep} without altering the theoretical framework. By deriving approximation bounds specifically for narrower fully connected deep neural networks, this note demonstrates that \citet[Theorem 1]{farrell2021deep} can be improved to achieve an optimal rate (up to a logarithmic factor). Furthermore, this note briefly shows that deep neural network estimators can mitigate the curse of dimensionality for functions with compositional structure and for functions defined on manifolds.
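As context for what "optimal rate" means here, a brief sketch under the standard nonparametric regression setting (this framing is a common benchmark, not quoted from the paper itself): for estimating a $\beta$-Hölder-smooth regression function $f_0$ on $[0,1]^d$ from $n$ samples, the classical minimax rate in squared $L_2$ error is $n^{-2\beta/(2\beta+d)}$, and a rate-optimal estimator matches it up to a logarithmic factor. In this notation the claimed improvement reads:

```latex
\inf_{\hat f}\,\sup_{f_0 \in \mathcal{H}^{\beta}([0,1]^d)}
  \mathbb{E}\,\bigl\|\hat f - f_0\bigr\|_{L_2}^{2}
  \;\asymp\; n^{-\frac{2\beta}{2\beta+d}},
\qquad
\bigl\|\hat f_{\mathrm{DNN}} - f_0\bigr\|_{L_2}^{2}
  \;=\; O_P\!\Bigl(n^{-\frac{2\beta}{2\beta+d}} \,(\log n)^{c}\Bigr),
```

where $c > 0$ is a generic constant standing in for the paper's logarithmic factor. The curse-of-dimensionality point is that for compositional or manifold-supported targets, the exponent involves an effective dimension smaller than the ambient $d$.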

Country of Origin
🇬🇧 United Kingdom

Page Count
30 pages

Category
Economics:
Econometrics