Neural Tangent Kernel of Neural Networks with Loss Informed by Differential Operators
By: Weiye Gan, Yicheng Li, Qian Lin, and more
Potential Business Impact:
Trains computers to learn physics faster.
Spectral bias is a significant phenomenon in neural network training that can be explained by neural tangent kernel (NTK) theory. In this work, we develop NTK theory for deep neural networks trained with a physics-informed loss, providing insight into the convergence of the NTK at initialization and during training and revealing its explicit structure. We find that, in most cases, the differential operators in the loss function do not induce a faster eigenvalue decay rate or a stronger spectral bias. Experimental results are presented to verify the theory.
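To make the central object concrete, the sketch below computes the empirical NTK of a small MLP when the loss applies a differential operator to the network output (here the 1D second derivative, as in a Poisson residual u''(x) = f(x)) and inspects the kernel's eigenvalue decay, which governs spectral bias. This is a minimal, hypothetical JAX example: the network, initialization, and operator choice are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch (not the authors' code): empirical NTK under a
# physics-informed loss whose residual operator is L[u] = u''.
# Kernel entries: K(x, x') = <grad_theta L[u_theta](x), grad_theta L[u_theta](x')>.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, sizes=(1, 64, 64, 1)):
    """Weights ~ N(0, 1); the 1/sqrt(fan_in) NTK scaling is applied in the forward pass."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)), jnp.zeros(n))
            for k, (m, n) in zip(keys, zip(sizes[:-1], sizes[1:]))]

def mlp(params, x):
    """Scalar network u_theta(x) with tanh activations and NTK parameterization."""
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W / jnp.sqrt(W.shape[0]) + b)
    W, b = params[-1]
    return (h @ W / jnp.sqrt(W.shape[0]) + b)[0]

def residual(params, x):
    """Physics-informed residual operator L[u_theta](x) = u_theta''(x)."""
    u = lambda t: mlp(params, t)
    return jax.grad(jax.grad(u))(x)

def empirical_ntk(params, xs):
    """K_ij = <d residual(x_i)/d theta, d residual(x_j)/d theta>."""
    flat_grad = lambda x: ravel_pytree(jax.grad(residual)(params, x))[0]
    J = jax.vmap(flat_grad)(xs)      # (n_points, n_params) Jacobian
    return J @ J.T

key = jax.random.PRNGKey(0)
xs = jnp.linspace(-1.0, 1.0, 32)
K = empirical_ntk(init_params(key), xs)
eigs = jnp.linalg.eigvalsh(K)[::-1]  # descending spectrum; decay rate drives spectral bias
print(eigs[:5])
```

Swapping the `residual` operator for the identity (plain regression on u_theta) and rerunning gives the baseline spectrum, so one can compare the eigenvalue decay rates directly; the abstract's finding is that, in most cases, the differential operator does not make this decay faster.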
Similar Papers
Neural Tangent Kernel Analysis to Probe Convergence in Physics-informed Neural Solvers: PIKANs vs. PINNs
Machine Learning (CS)
Helps computers solve hard math problems faster.
Divergence of Empirical Neural Tangent Kernel in Classification Problems
Machine Learning (CS)
Shows how some computer brains learn differently than expected.
Convergence and Sketching-Based Efficient Computation of Neural Tangent Kernel Weights in Physics-Based Loss
Numerical Analysis
Makes AI learn faster and better by balancing goals.