Score: 2

Random feature approximation for general spectral methods

Published: June 19, 2025 | arXiv ID: 2506.16283v1

By: Mike Nguyen, Nicole Mücke

Potential Business Impact:

Speeds up large-scale machine learning by approximating expensive kernel methods, while preserving theoretical accuracy guarantees.

Business Areas:
A/B Testing, Data and Analytics

Random feature approximation is arguably one of the most widely used techniques for kernel methods in large-scale learning algorithms. In this work, we analyze the generalization properties of random feature methods, extending previous results for Tikhonov regularization to a broad class of spectral regularization techniques. This class covers not only explicit methods but also implicit schemes such as gradient descent and accelerated algorithms such as the Heavy-Ball and Nesterov methods. The framework in turn enables a theoretical analysis of neural networks and neural operators trained via gradient descent, through the lens of the Neural Tangent Kernel (NTK). For our estimators we obtain optimal learning rates over regularity classes defined through appropriate source conditions, including classes not contained in the reproducing kernel Hilbert space. This improves or completes previous results obtained in related settings for specific kernel algorithms.
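The paper is theory-only; as a minimal, hypothetical sketch of the setting it studies, the snippet below approximates a Gaussian kernel with random Fourier features and fits a regressor with plain gradient descent, where early stopping acts as the implicit spectral regularizer. The toy data, bandwidth `sigma`, feature count `D`, and iteration budget are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (hypothetical data, for illustration only)
n, d, D = 200, 1, 300                      # samples, input dim, random features
X = rng.uniform(-2.0, 2.0, size=(n, d))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Random Fourier features approximating a Gaussian kernel:
#   k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)) ~ z(x)^T z(x')
sigma = 0.5
W = rng.standard_normal((d, D)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # (n, D) feature matrix

# Implicit spectral regularization: gradient descent on the least-squares
# objective 0.5 * ||Z w - y||^2, stopped early; the iteration count plays
# the role the regularization parameter plays in Tikhonov regularization.
step = 1.0 / np.linalg.norm(Z, 2) ** 2     # step size below 1 / ||Z||_op^2
w = np.zeros(D)
for _ in range(500):                       # early stopping = regularization
    w -= step * Z.T @ (Z @ w - y)

# Predict on a test grid using the same random feature map
X_test = np.linspace(-2.0, 2.0, 100).reshape(-1, 1)
y_hat = np.sqrt(2.0 / D) * np.cos(X_test @ W + b) @ w
print("train MSE:", np.mean((Z @ w - y) ** 2))
```

Accelerated schemes of the kind the abstract mentions (Heavy-Ball, Nesterov) would replace the update inside the loop; the paper's contribution is rates for this whole family of spectral methods, not this particular implementation.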

Country of Origin
🇩🇪 Germany

Repos / Data Links

Page Count
39 pages

Category
Statistics: Machine Learning (stat.ML)