Score: 1

NTK-Guided Implicit Neural Teaching

Published: November 19, 2025 | arXiv ID: 2511.15487v1

By: Chen Zhang, Wei Zuo, Bingyang Cheng, and more

Potential Business Impact:

Cuts the time needed to train neural networks that compactly represent images, audio, and 3D scenes.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Implicit Neural Representations (INRs) parameterize continuous signals via multilayer perceptrons (MLPs), enabling compact, resolution-independent modeling for tasks like image, audio, and 3D reconstruction. However, fitting high-resolution signals demands optimizing over millions of coordinates, incurring prohibitive computational costs. To address this, we propose NTK-Guided Implicit Neural Teaching (NINT), which accelerates training by dynamically selecting coordinates that maximize global functional updates. Leveraging the Neural Tangent Kernel (NTK), NINT scores examples by the norm of their NTK-augmented loss gradients, capturing both fitting errors and heterogeneous leverage (self-influence and cross-coordinate coupling). This dual consideration enables faster convergence than existing methods. Through extensive experiments, we demonstrate that NINT reduces training time by nearly half while maintaining or improving representation quality, establishing state-of-the-art acceleration among recent sampling-based strategies.
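To make the scoring rule concrete, here is a minimal PyTorch sketch of NTK-weighted coordinate selection. It assumes a toy MLP, a small candidate pool, and one plausible reading of the abstract: each coordinate's score is the magnitude of the NTK-weighted sum of residuals, which couples its own fitting error (self-influence) with cross-coordinate coupling. The model, pool size, and top-k batch size are illustrative, not the paper's.

    import torch
    import torch.nn as nn
    from torch.func import functional_call, vmap, grad

    # Toy coordinate-MLP INR; the paper's architecture may differ.
    model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                          nn.Linear(64, 64), nn.Tanh(),
                          nn.Linear(64, 1))
    params = dict(model.named_parameters())

    coords = torch.rand(512, 2)    # candidate pixel coordinates in [0, 1]^2
    targets = torch.rand(512, 1)   # their ground-truth intensities

    def f_scalar(p, x):
        # scalar network output at a single coordinate
        return functional_call(model, p, (x.unsqueeze(0),)).squeeze()

    # Per-coordinate parameter gradients, stacked into a Jacobian J (N x P)
    per_grads = vmap(grad(f_scalar), in_dims=(None, 0))(params, coords)
    J = torch.cat([g.reshape(coords.shape[0], -1)
                   for g in per_grads.values()], dim=1)

    # Empirical NTK on the pool: K[i, j] = <grad f(x_i), grad f(x_j)>
    K = J @ J.T

    # Residuals of the current fit (MSE loss gradient in function space)
    with torch.no_grad():
        resid = (model(coords) - targets).squeeze(-1)

    # Score each coordinate by its NTK-augmented gradient magnitude:
    # |sum_j K[i, j] * resid[j]| blends the coordinate's own error
    # (via K[i, i]) with errors coupled in from other coordinates.
    scores = (K @ resid).abs()
    selected = scores.topk(64).indices  # train this step on the top 64

Note that forming the full NTK is quadratic in the pool size, so a sketch like this is only practical on a random subsample of coordinates per step; the paper's actual selection procedure may use cheaper approximations.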

Country of Origin
🇭🇰 Hong Kong

Page Count
16 pages

Category
Computer Science:
Machine Learning (CS)