Mathematical Foundations of Neural Tangents and Infinite-Width Networks

Published: December 9, 2025 | arXiv ID: 2512.08264v1

By: Rachana Mysore, Preksha Girish, Kavitha Jayaram, and more

Potential Business Impact:

Offers theoretical guidance for designing neural networks that train more stably and generalize better.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We investigate the mathematical foundations of neural networks in the infinite-width regime through the Neural Tangent Kernel (NTK). We propose the NTK-Eigenvalue-Controlled Residual Network (NTK-ECRN), an architecture integrating Fourier feature embeddings, residual connections with layerwise scaling, and stochastic depth to enable rigorous analysis of kernel evolution during training. Our theoretical contributions include deriving bounds on NTK dynamics, characterizing eigenvalue evolution, and linking spectral properties to generalization and optimization stability. Empirical results on synthetic and benchmark datasets validate the predicted kernel behavior and demonstrate improved training stability and generalization. This work provides a comprehensive framework bridging infinite-width theory and practical deep-learning architectures.
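For orientation, the standard NTK setup that this line of work builds on can be summarized as follows. This is a generic sketch of textbook infinite-width theory, not the paper's specific derivation; the bounds and eigenvalue-evolution results described in the abstract go beyond it.

```latex
% The empirical NTK: the Gram matrix of parameter gradients.
\[
  \Theta(x, x') \;=\; \big\langle \nabla_\theta f(x;\theta),\, \nabla_\theta f(x';\theta) \big\rangle
\]
% Under gradient flow on squared loss with rate $\eta$, and with the kernel
% approximately fixed (the infinite-width regime), the training residual
% evolves linearly:
\[
  f_t(X) - Y \;=\; e^{-\eta \Theta t}\,\big(f_0(X) - Y\big).
\]
% Along an eigenvector of $\Theta$ with eigenvalue $\lambda_i$, the error
% therefore decays at rate $e^{-\eta \lambda_i t}$: small eigenvalues are
% learned slowly, while the largest eigenvalue bounds the stable learning
% rate. This is why controlling the NTK spectrum bears directly on both
% optimization stability and generalization.
```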
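The abstract does not specify the architecture in detail, so the sketch below is a minimal, hypothetical JAX rendering of the named ingredients: an empirical-NTK computation plus a forward pass combining Fourier feature embeddings, residual branches with layerwise scaling, and stochastic depth. The helper names (`init_params`, `ecrn_forward`), the scaling choice alpha = 1/sqrt(L), and the survival probability are all illustrative assumptions, not the paper's construction.

```python
import jax
import jax.numpy as jnp

def init_params(key, width, depth, out_dim=1):
    """Residual-block weights (width -> width) plus a linear readout."""
    params = []
    for _ in range(depth):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (width, width)) / jnp.sqrt(width),
                       jnp.zeros(width)))
    key, sub = jax.random.split(key)
    params.append((jax.random.normal(sub, (width, out_dim)) / jnp.sqrt(width),
                   jnp.zeros(out_dim)))
    return params

def fourier_features(x, B):
    # Random Fourier embedding: gamma(x) = [cos(2*pi*xB), sin(2*pi*xB)].
    proj = 2.0 * jnp.pi * (x @ B)
    return jnp.concatenate([jnp.cos(proj), jnp.sin(proj)], axis=-1)

def ecrn_forward(params, x, B, key=None, survival=0.9):
    # Hypothetical forward pass: h_{l+1} = h_l + alpha * f_l(h_l), with
    # alpha = 1/sqrt(L) as an *assumed* layerwise scaling. When `key` is
    # given (training), each residual branch is dropped with probability
    # 1 - survival (stochastic depth), rescaled to preserve expectations.
    h = fourier_features(x, B)
    blocks, (w_out, b_out) = params[:-1], params[-1]
    alpha = 1.0 / jnp.sqrt(len(blocks))
    for w, b in blocks:
        branch = jnp.tanh(h @ w + b)
        if key is not None:
            key, sub = jax.random.split(key)
            keep = jax.random.bernoulli(sub, survival)
            branch = jnp.where(keep, branch / survival, 0.0)
        h = h + alpha * branch
    return (h @ w_out + b_out).squeeze(-1)

def empirical_ntk(params, x1, x2, B):
    # Theta(x1, x2) = J(x1) J(x2)^T, with J the Jacobian of the network
    # output w.r.t. all parameters (deterministic pass, no stochastic depth).
    j1 = jax.jacobian(lambda p: ecrn_forward(p, x1, B))(params)
    j2 = jax.jacobian(lambda p: ecrn_forward(p, x2, B))(params)
    flat = lambda j, n: jnp.concatenate(
        [leaf.reshape(n, -1) for leaf in jax.tree_util.tree_leaves(j)], axis=1)
    return flat(j1, x1.shape[0]) @ flat(j2, x2.shape[0]).T

key = jax.random.PRNGKey(0)
key, bkey = jax.random.split(key)
B = jax.random.normal(bkey, (1, 16))           # 1-D inputs, 16 Fourier modes
params = init_params(key, width=32, depth=4)   # width = 2 * number of modes
x = jnp.linspace(-1.0, 1.0, 8).reshape(-1, 1)
print(jnp.linalg.eigvalsh(empirical_ntk(params, x, x, B)))  # NTK spectrum
```

Printing the eigenvalues of the empirical NTK at initialization, as done in the last line, is the kind of diagnostic that the paper's spectral analysis would track over the course of training.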

Page Count
7 pages

Category
Computer Science: Machine Learning (CS)