Low-rank approximation of analytic kernels
By: Marcus Webb
Potential Business Impact:
Speeds up the matrix computations behind many scientific computing and data science algorithms.
Many algorithms in scientific computing and data science take advantage of low-rank approximation of matrices and kernels, and understanding why nearly-low-rank structure occurs is essential for their analysis and further development. This paper provides a framework for bounding the best low-rank approximation error of matrices arising from samples of a kernel that is analytically continuable in one of its variables to an open region of the complex plane. Elegantly, the low-rank approximations used in the proof are computable by rational interpolation using the roots and poles of Zolotarev rational functions, leading to a fast algorithm for their construction.
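The paper's construction is based on rational interpolation at the roots and poles of Zolotarev rational functions. The sketch below is not that algorithm; it is a minimal numerical illustration, assuming NumPy and an illustrative Cauchy kernel on two separated intervals, of the phenomenon the paper explains: when a kernel is analytic in one of its variables, the singular values of a sampled matrix decay rapidly, so the matrix is numerically low-rank and a truncated SVD already gives an accurate approximation.

```python
# Minimal sketch (not the paper's algorithm): show that a matrix sampled from a
# kernel analytic in one variable is numerically low-rank.
# Kernel, grids, and tolerance are illustrative assumptions.
import numpy as np

n = 500
x = np.linspace(0.0, 1.0, n)          # sample points for the first variable
y = np.linspace(2.0, 3.0, n)          # separated interval for the second variable
K = 1.0 / (x[:, None] - y[None, :])   # Cauchy kernel, analytic away from x = y

# Singular values decay geometrically for such kernels, so only a handful
# exceed any fixed relative tolerance.
s = np.linalg.svd(K, compute_uv=False)
tol = 1e-10
numerical_rank = int(np.sum(s > tol * s[0]))
print(f"matrix size: {n} x {n}, numerical rank at tol {tol:g}: {numerical_rank}")
```

Running this shows a numerical rank far smaller than n, which is the kind of nearly-low-rank structure whose best-approximation error the paper bounds.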
Similar Papers
Rank of Matrices Arising out of Singular Kernel Functions
Numerical Analysis
Helps computers understand complex data better.
A general technique for approximating high-dimensional empirical kernel matrices
Machine Learning (Stat)
Makes computer predictions more accurate for complex data.
Matrices over a Hilbert space and their low-rank approximation
Numerical Analysis
Makes computers solve hard math problems faster.