NTK-Guided Implicit Neural Teaching
By: Chen Zhang, Wei Zuo, Bingyang Cheng, and more
Potential Business Impact:
Teaches computers to learn faster from images.
Implicit Neural Representations (INRs) parameterize continuous signals via multilayer perceptrons (MLPs), enabling compact, resolution-independent modeling for tasks such as image, audio, and 3D reconstruction. However, fitting high-resolution signals demands optimizing over millions of coordinates, incurring prohibitive computational cost. To address this, we propose NTK-Guided Implicit Neural Teaching (NINT), which accelerates training by dynamically selecting the coordinates that maximize the global functional update. Leveraging the Neural Tangent Kernel (NTK), NINT scores examples by the norm of their NTK-augmented loss gradients, capturing both fitting errors and heterogeneous leverage (self-influence and cross-coordinate coupling). This dual consideration yields faster convergence than existing methods. Through extensive experiments, we demonstrate that NINT reduces training time by nearly half while maintaining or improving representation quality, establishing state-of-the-art acceleration among recent sampling-based strategies.
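To make the scoring rule concrete, below is a minimal PyTorch sketch of one plausible reading of the NTK-augmented gradient-norm criterion: each candidate coordinate is scored by its fitting error weighted by the norm of its empirical NTK row against a reference set of coordinates, so both self-influence and cross-coordinate coupling enter the score. The helper names (`per_example_grads`, `nint_scores`), the reference-subsampling shortcut, and the per-example gradient loop are illustrative assumptions for exposition, not the authors' implementation.

```python
# Minimal sketch of NTK-guided coordinate scoring (hypothetical helpers; not the
# authors' released code). Assumes a small coordinate MLP mapping (x, y) -> intensity,
# trained with an MSE objective.
import torch
import torch.nn as nn

def per_example_grads(model, coords):
    """Flattened parameter gradient of the scalar output for each coordinate."""
    grads = []
    for x in coords:
        out = model(x.unsqueeze(0)).sum()
        g = torch.autograd.grad(out, model.parameters())
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    return torch.stack(grads)  # shape: (n, p)

def nint_scores(model, coords, targets, ref_coords):
    """Score each candidate coordinate by |residual_i| * ||K(ref, x_i)||, where
    K(x_j, x_i) = grad_theta f(x_j) . grad_theta f(x_i) is the empirical NTK."""
    with torch.no_grad():
        residuals = (model(coords) - targets).squeeze(-1)  # fitting errors
    G_cand = per_example_grads(model, coords)       # (n, p)
    G_ref = per_example_grads(model, ref_coords)    # (m, p)
    K = G_ref @ G_cand.T                            # empirical NTK rows, (m, n)
    # Norm over reference coordinates captures self-influence and cross-coupling.
    return residuals.abs() * K.norm(dim=0)

# Usage: pick the top-k highest-scoring coordinates for the next update step.
model = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))
coords = torch.rand(256, 2)    # candidate pixel coordinates in [0, 1]^2
targets = torch.rand(256, 1)   # ground-truth intensities at those coordinates
ref = torch.rand(128, 2)       # subsample approximating the global NTK coupling
scores = nint_scores(model, coords, targets, ref)
top_k = scores.topk(64).indices  # coordinates that maximize the functional update
```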
Similar Papers
Understanding NTK Variance in Implicit Neural Representations
Machine Learning (CS)
Makes AI learn faster and better.
Scaling Implicit Fields via Hypernetwork-Driven Multiscale Coordinate Transformations
Artificial Intelligence
Makes computer pictures clearer with less data.
Optimizing Rank for High-Fidelity Implicit Neural Representations
CV and Pattern Recognition
Makes simple computer brains show sharp, detailed pictures.