
The stability of shallow neural networks on spheres: A sharp spectral analysis

Published: November 4, 2025 | arXiv ID: 2511.02625v1

By: Xinliang Liu, Tong Mao, Jinchao Xu

Potential Business Impact:

Shows when shallow ReLU networks can be fit reliably, by quantifying the trade-off between a network's approximation power and its numerical stability.

Business Areas:
Data and Analytics

We present estimates of the condition numbers of the \emph{mass} and \emph{stiffness} matrices arising from shallow ReLU$^k$ neural networks defined on the unit sphere $\mathbb{S}^d$. In particular, when the centers $\{\theta_j^*\}_{j=1}^n \subset \mathbb{S}^d$ are \emph{antipodally quasi-uniform}, the condition-number estimate is sharp. Indeed, in this case we obtain sharp asymptotic estimates for the full spectrum of eigenvalues and characterize the structure of the corresponding eigenspaces, showing that the smallest eigenvalues are associated with an eigenbasis of low-degree polynomials while the largest eigenvalues are linked to high-degree polynomials. This spectral analysis establishes a precise correspondence between the approximation power of the network and its numerical stability.
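
To make the object of study concrete, here is a minimal numerical sketch of the mass (Gram) matrix of ReLU$^k$ features on the sphere and its condition number. It assumes the mass matrix has entries $M_{ij} = \int_{\mathbb{S}^d} \mathrm{ReLU}^k(\theta\cdot\theta_i^*)\,\mathrm{ReLU}^k(\theta\cdot\theta_j^*)\,d\theta$ (up to normalization), and it uses Monte Carlo quadrature with randomly drawn centers rather than the antipodally quasi-uniform point sets analyzed in the paper; all function names and parameter values below are illustrative, not the authors' construction.

```python
import numpy as np

# Illustrative sketch (not the paper's method): Monte Carlo estimate of the mass matrix
# M_ij ~ E_theta[ ReLU^k(theta . theta_i) * ReLU^k(theta . theta_j) ] for theta uniform
# on the unit sphere, followed by its eigenvalues and condition number.

rng = np.random.default_rng(0)

def random_sphere_points(n, dim):
    """Draw n points uniformly on the unit sphere in R^dim."""
    x = rng.standard_normal((n, dim))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def relu_k(t, k):
    """ReLU^k activation: max(t, 0)^k."""
    return np.maximum(t, 0.0) ** k

def mass_matrix(centers, k, num_mc=200_000):
    """Monte Carlo approximation of the ReLU^k mass (Gram) matrix."""
    dim = centers.shape[1]
    samples = random_sphere_points(num_mc, dim)   # quadrature points on the sphere
    feats = relu_k(samples @ centers.T, k)        # (num_mc, n) feature evaluations
    return feats.T @ feats / num_mc               # M_ij ~ mean of phi_i * phi_j

if __name__ == "__main__":
    d, n, k = 2, 40, 1                            # sphere S^2 in R^3, 40 neurons, ReLU^1
    centers = random_sphere_points(n, d + 1)      # stand-in for quasi-uniform {theta_j^*}
    M = mass_matrix(centers, k)
    eigvals = np.linalg.eigvalsh(M)               # ascending eigenvalues
    print(f"lambda_min ~ {eigvals[0]:.3e}, lambda_max ~ {eigvals[-1]:.3e}, "
          f"cond(M) ~ {eigvals[-1] / eigvals[0]:.3e}")
```

Running this for increasing $n$ gives a rough sense of how quickly the smallest eigenvalue decays, and hence how the conditioning degrades, as more neurons are placed on the sphere; the paper's contribution is the sharp asymptotic characterization of this spectrum for antipodally quasi-uniform centers.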

Page Count
30 pages

Category
Mathematics:
Numerical Analysis (math.NA)