Neural Tangent Kernel of Neural Networks with Loss Informed by Differential Operators

Published: March 14, 2025 | arXiv ID: 2503.11029v1

By: Weiye Gan, Yicheng Li, Qian Lin, and more

Potential Business Impact:

Explains how neural networks learn physics equations, which could guide faster, more reliable training of physics-informed models.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Spectral bias is a significant phenomenon in neural network training and can be explained by neural tangent kernel (NTK) theory. In this work, we develop NTK theory for deep neural networks with physics-informed loss, providing insight into the convergence of the NTK at initialization and during training, and revealing its explicit structure. We find that, in most cases, the differential operators in the loss function do not induce a faster eigenvalue decay rate or a stronger spectral bias. Experimental results are also presented to verify the theory.
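To make the object of study concrete: in the physics-informed setting, the relevant NTK is the kernel of the network output composed with the differential operator appearing in the loss, and the paper's question is how that operator affects the kernel's eigenvalue decay. The sketch below is a minimal illustration, not the paper's method: it computes an empirical NTK in JAX for a small MLP under an assumed 1D Laplacian operator L u = u'', then inspects the kernel's eigenvalues. The network widths, operator choice, sample points, and all helper names (`init_params`, `mlp`, `residual`, `empirical_ntk`) are illustrative assumptions.

```python
# A minimal sketch (not from the paper): empirical NTK of a physics-informed
# residual, assuming the differential operator L u = u'' (1D Laplacian).
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, widths=(1, 64, 64, 1)):
    """Initialize an MLP with weights ~ N(0, 1/fan_in) (roughly NTK scaling)."""
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        W = jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in)
        params.append((W, jnp.zeros(d_out)))
    return params

def mlp(params, x):
    """Scalar-input, scalar-output network u_theta(x)."""
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def residual(params, x):
    """Physics-informed feature map: apply the operator u -> u'' to the network."""
    u = lambda t: mlp(params, t)
    return jax.grad(jax.grad(u))(x)  # second derivative w.r.t. the input

def empirical_ntk(params, xs):
    """K_ij = <d residual(x_i)/d theta, d residual(x_j)/d theta>."""
    grad_theta = lambda x: ravel_pytree(jax.grad(residual, argnums=0)(params, x))[0]
    J = jnp.stack([grad_theta(x) for x in xs])  # (n_points, n_params) Jacobian
    return J @ J.T
```

Under these assumptions, the spectrum of `K` is what the theory describes: its eigenvalue decay rate determines how strongly low-frequency components are favored during training.

```python
key = jax.random.PRNGKey(0)
params = init_params(key)
xs = jnp.linspace(-1.0, 1.0, 32)
K = empirical_ntk(params, xs)
eigs = jnp.linalg.eigvalsh(K)[::-1]  # descending eigenvalues; their decay
                                     # rate governs the spectral bias
```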

Country of Origin
🇨🇳 China

Page Count
35 pages

Category
Computer Science:
Machine Learning (CS)