Nonlocal Neural Tangent Kernels via Parameter-Space Interactions

Published: September 15, 2025 | arXiv ID: 2509.12467v1

By: Sriram Nagaraj, Vishakh Hari

Potential Business Impact:

Extends neural-network training theory to non-smooth models and noisy gradient estimates, helping computers learn from messy, imperfect data.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

The Neural Tangent Kernel (NTK) framework has provided deep insights into the training dynamics of neural networks under gradient flow. However, it relies on the assumption that the network is differentiable with respect to its parameters, an assumption that breaks down when considering non-smooth target functions or parameterized models exhibiting non-differentiable behavior. In this work, we propose a Nonlocal Neural Tangent Kernel (NNTK) that replaces the local gradient with a nonlocal interaction-based approximation in parameter space. Nonlocal gradients are known to exist for a wider class of functions than the standard gradient. This allows NTK theory to be extended to non-smooth functions, stochastic estimators, and broader families of models. We explore both fixed-kernel and attention-based formulations of this nonlocal operator. We illustrate the new formulation with numerical studies.
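
The abstract describes replacing the NTK's parameter gradient with a nonlocal interaction-based approximation. The sketch below is a minimal illustration of that idea, not the paper's method: it assumes a Gaussian-smoothing (zeroth-order) estimator as the fixed interaction kernel, and the names `nonlocal_grad` and `nntk` are hypothetical. The standard NTK entry K(x, x') = ∇θf(x; θ) · ∇θf(x'; θ) is replaced by an inner product of nonlocal gradient estimates, which stay well defined even where f is not differentiable in θ.

```python
import numpy as np

# Hypothetical non-smooth scalar model f(x; theta).
# The ReLU kink makes the parameter gradient undefined at theta @ x = 0.
def f(x, theta):
    return np.maximum(theta @ x, 0.0)

def nonlocal_grad(x, theta, U, sigma=0.1):
    """Nonlocal (Gaussian-smoothed, zeroth-order) gradient estimate:
    averages finite differences along random directions U[i] ~ N(0, I).
    Unlike the local gradient, this exists for non-smooth f."""
    base = f(x, theta)
    diffs = np.array([(f(x, theta + sigma * u) - base) / sigma for u in U])
    return diffs @ U / len(U)

def nntk(x1, x2, theta, num_samples=256, sigma=0.1, seed=0):
    """NNTK entry: inner product of nonlocal gradient estimates,
    sharing the same perturbation directions for both inputs."""
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((num_samples, theta.size))
    return nonlocal_grad(x1, theta, U, sigma) @ nonlocal_grad(x2, theta, U, sigma)

theta = np.zeros(3)  # sits exactly at the ReLU kink
x1 = np.array([1.0, 0.5, -0.2])
x2 = np.array([0.3, -1.0, 0.8])
print(nntk(x1, x2, theta))  # finite even where the local gradient is undefined
```

This corresponds to the fixed-kernel variant, where every input sees the same isotropic Gaussian interactions in parameter space; the attention-based formulation mentioned in the abstract would presumably replace that fixed weighting of perturbation directions with data-dependent weights.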

Page Count
21 pages

Category
Computer Science:
Machine Learning (CS)