The stability of shallow neural networks on spheres: A sharp spectral analysis
By: Xinliang Liu, Tong Mao, Jinchao Xu
Potential Business Impact:
Helps neural networks train more stably and reliably.
We present estimates of the condition numbers of the \emph{mass} and \emph{stiffness} matrices arising from shallow ReLU$^k$ neural networks defined on the unit sphere~$\mathbb{S}^d$. In particular, when the nodes $\{\theta_j^*\}_{j=1}^n \subset \mathbb{S}^d$ are \emph{antipodally quasi-uniform}, the condition-number estimate is sharp. Indeed, in this case we obtain sharp asymptotic estimates for the full spectrum of eigenvalues and characterize the structure of the corresponding eigenspaces, showing that the smallest eigenvalues are associated with an eigenbasis of low-degree polynomials while the largest eigenvalues are linked to high-degree polynomials. This spectral analysis establishes a precise correspondence between the approximation power of the network and its numerical stability.
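As a rough illustration (not the paper's construction), the sketch below assembles a Monte Carlo approximation of the ReLU$^k$ mass matrix $M_{ij} = \int_{\mathbb{S}^d} \mathrm{ReLU}^k(\theta_i \cdot x)\,\mathrm{ReLU}^k(\theta_j \cdot x)\,d\sigma(x)$ and reports its condition number. Random directions are only a stand-in for the antipodally quasi-uniform nodes the paper analyzes, and all names and parameter values here are illustrative assumptions.

```python
import numpy as np

# Minimal sketch, assuming: S^d with d = 2, n neurons, ReLU^k with k = 1,
# and plain Monte Carlo quadrature in place of an exact integral.
rng = np.random.default_rng(0)
d, n, k, n_quad = 2, 40, 1, 200_000

def sample_sphere(m, d, rng):
    """Uniform samples on S^d in R^{d+1} via normalized Gaussians."""
    x = rng.standard_normal((m, d + 1))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

# Random directions: a stand-in for the antipodally quasi-uniform
# nodes {theta_j^*} in the paper, NOT the paper's point configuration.
theta = sample_sphere(n, d, rng)
x = sample_sphere(n_quad, d, rng)  # quadrature nodes on the sphere

# Feature matrix phi[q, j] = ReLU(theta_j . x_q)^k.
phi = np.maximum(x @ theta.T, 0.0) ** k

# Monte Carlo mass (Gram) matrix w.r.t. normalized surface measure.
M = phi.T @ phi / n_quad

# eigvalsh returns eigenvalues in ascending order; their ratio gives
# the condition number whose growth the paper quantifies sharply.
eigvals = np.linalg.eigvalsh(M)
print(f"condition number ~ {eigvals[-1] / eigvals[0]:.3e}")
```

Projecting the eigenvectors for the extreme eigenvalues onto spherical harmonics of increasing degree would, per the paper's result, show the smallest eigenvalues aligning with low-degree polynomials and the largest with high-degree ones.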
Similar Papers
Condition Numbers and Eigenvalue Spectra of Shallow Networks on Spheres
Numerical Analysis
Explains how stable shallow networks are to train.
Sharp Lower Bounds for Linearized ReLU^k Approximation on the Sphere
Numerical Analysis
Shows hard limits on how well these networks can approximate functions.
On the Stability of the Jacobian Matrix in Deep Neural Networks
Machine Learning (CS)
Helps deep networks stay stable while they learn.