On the randomized SVD in infinite dimensions
By: Daniel Kressner, David Persson, André Uschmajew
Potential Business Impact:
Makes solving big math problems faster and more accurate.
Randomized methods, such as the randomized SVD (singular value decomposition) and Nyström approximation, are an effective way to compute low-rank approximations of large matrices. Motivated by applications to operator learning, Boullé and Townsend (FoCM, 2023) recently proposed an infinite-dimensional extension of the randomized SVD for a Hilbert--Schmidt operator $A$ that invokes randomness through a Gaussian process with a covariance operator $K$. While the non-isotropy introduced by $K$ allows one to incorporate prior information on $A$, an unfortunate choice may lead to unfavorable performance and large constants in the error bounds. In this work, we introduce a novel infinite-dimensional extension of the randomized SVD that does not require such a choice and enjoys error bounds that match those for the finite-dimensional case. Moreover, it reflects the common practice of using the randomized SVD with isotropic random vectors, also when approximating discretized operators. In fact, the theoretical results of this work show how the usual randomized SVD applied to a discretization of $A$ approaches our infinite-dimensional extension as the discretization gets refined, both in terms of error bounds and the Wasserstein distance. We also present and analyze a novel extension of the Nyström approximation for self-adjoint positive semi-definite trace class operators.
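For readers unfamiliar with the finite-dimensional building blocks the abstract refers to, the following is a minimal NumPy sketch (not the authors' code) of the standard randomized SVD with isotropic Gaussian test vectors, alongside the classical Nyström approximation for a symmetric positive semi-definite matrix; function names, the oversampling parameter, and default values are illustrative choices, not taken from the paper.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, rng=None):
    """Standard finite-dimensional randomized SVD (Halko-Martinsson-Tropp style)
    using an isotropic Gaussian sketch, i.e., identity covariance."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # Draw isotropic Gaussian test vectors and sketch the range of A.
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega
    # Orthonormal basis for the sampled range.
    Q, _ = np.linalg.qr(Y)
    # Project A onto that basis and take an exact SVD of the small matrix.
    B = Q.T @ A
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat
    return U[:, :rank], s[:rank], Vt[:rank]

def nystrom_psd(A, rank, oversample=10, rng=None):
    """Nystrom approximation A_nys = Y (Omega^T Y)^+ Y^T for symmetric PSD A,
    the finite-dimensional analogue of the operator version in the paper."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega
    core = np.linalg.pinv(Omega.T @ Y)  # pseudoinverse of the small core matrix
    return Y @ core @ Y.T
```

With oversampling, both sketches recover an exactly rank-$r$ matrix up to rounding error; for general matrices they yield near-optimal low-rank approximations with high probability.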
Similar Papers
Randomized block Krylov method for approximation of truncated tensor SVD
Numerical Analysis
Makes big data smaller for computers.
A Precise Performance Analysis of the Randomized Singular Value Decomposition
Numerical Analysis
Makes computer math faster for big data.
Randomized Quantum Singular Value Transformation
Quantum Physics
Makes quantum computers solve problems faster and easier.