STNet: Spectral Transformation Network for Solving Operator Eigenvalue Problem
By: Hong Wang, Yixuan Jiang, Jie Wang, and more
Potential Business Impact:
Solves hard science and engineering math problems faster using neural networks.
Operator eigenvalue problems play a critical role in various scientific fields and engineering applications, yet classical numerical methods are hindered by the curse of dimensionality. Recent deep learning methods offer an efficient way to address this challenge by iteratively updating neural networks. Their performance, however, depends heavily on the spectral distribution of the given operator: larger gaps between the operator's eigenvalues improve precision, so tailored spectral transformations that exploit the spectral distribution can enhance performance. Based on this observation, we propose the Spectral Transformation Network (STNet). During each iteration, STNet uses the current approximate eigenvalues and eigenfunctions to apply spectral transformations to the original operator, turning it into an equivalent but easier problem. Specifically, we employ a deflation projection to exclude the subspace spanned by already solved eigenfunctions, which shrinks the search space and prevents convergence to previously found eigenfunctions. In addition, a filter transform magnifies eigenvalues in the desired region and suppresses those outside it, further improving performance. Extensive experiments demonstrate that STNet consistently outperforms existing learning-based methods, achieving state-of-the-art accuracy.
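To make the two transformations concrete, below is a minimal NumPy sketch of deflation projection and a spectral filter on a finite-dimensional matrix stand-in for the operator. This is an illustration under stated assumptions, not the authors' implementation (STNet parameterizes eigenfunctions with neural networks rather than vectors), and the shift-invert filter here is one simple instance of a filter transform; all names are illustrative.

```python
import numpy as np

def deflation_projector(eigvecs: np.ndarray) -> np.ndarray:
    # Orthogonal projector onto the complement of the solved eigenspace:
    # P = I - V V^T, where V's columns span the solved eigenvectors.
    V, _ = np.linalg.qr(eigvecs)           # re-orthonormalize for stability
    return np.eye(V.shape[0]) - V @ V.T

def shift_invert_filter(A: np.ndarray, sigma: float) -> np.ndarray:
    # Maps each eigenvalue lambda of A to 1/(lambda - sigma): eigenvalues
    # near the shift sigma are magnified, all others are suppressed.
    return np.linalg.inv(A - sigma * np.eye(A.shape[0]))

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2                          # symmetric toy "operator"
lams, vecs = np.linalg.eigh(A)             # ground truth, for checking only

solved = vecs[:, :2]                       # pretend two eigenpairs are solved
P = deflation_projector(solved)
A_defl = P @ A @ P                         # solved directions now map to 0

sigma = lams[2] - 1e-2                     # shift just below the next target
B = shift_invert_filter(A_defl, sigma)

x = rng.standard_normal(50)                # power iteration on the filtered,
for _ in range(200):                       # deflated operator converges to the
    x = B @ x                              # next eigenvector, not a solved one
    x /= np.linalg.norm(x)

print("target eigenvalue:   ", lams[2])
print("recovered eigenvalue:", x @ A @ x)  # Rayleigh quotient ~ lams[2]
```

The sketch shows why the two transformations compose: deflation sends solved eigenpairs to zero so iterations cannot re-converge to them, and the filter reshapes the spectrum so the desired eigenvalue becomes dominant, enlarging the effective spectral gap that governs convergence speed.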
Similar Papers
When do spectral gradient updates help in deep learning?
Machine Learning (CS)
Makes AI learn faster by changing how it trains.
The Operator Origins of Neural Scaling Laws: A Generalized Spectral Transport Dynamics of Deep Learning
Machine Learning (CS)
Makes AI learn faster and better.
Accelerating Eigenvalue Dataset Generation via Chebyshev Subspace Filter
Machine Learning (CS)
Finds math answers much faster for computers.