A Precise Performance Analysis of the Randomized Singular Value Decomposition
By: Danil Akhtiamov, Reza Ghane, Babak Hassibi
Potential Business Impact:
Makes computer math faster for big data.
The Randomized Singular Value Decomposition (RSVD) is a widely used algorithm for efficiently computing low-rank approximations of large matrices without constructing a full SVD. Of interest, of course, is the approximation error of RSVD compared to the optimal low-rank approximation error obtained from the SVD. While the literature provides various upper and lower error bounds for RSVD, in this paper we derive precise asymptotic expressions that characterize its approximation error as the matrix dimensions grow to infinity. Our expressions depend only on the singular values of the matrix, and we evaluate them for two important matrix ensembles: those with power-law and bilevel singular value distributions. Our results quantify the gap between the existing theoretical bounds and the actual performance of RSVD. Furthermore, we extend our analysis to polynomial-filtered RSVD, deriving performance characterizations that provide insights into optimal filter selection.
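For readers unfamiliar with the algorithm the abstract analyzes, the following is a minimal NumPy sketch of standard RSVD (random sketch, QR, project, small SVD), in the style of Halko, Martinsson, and Tropp. The function name `rsvd` and the parameter choices are illustrative, not from the paper; the optional power iterations correspond to the simplest polynomial filter of the kind the paper's final section studies.

```python
import numpy as np

def rsvd(A, k, p=10, q=0, seed=0):
    """Rank-k randomized SVD sketch.

    k: target rank; p: oversampling; q: power iterations
    (q > 0 applies the polynomial filter A (A^T A)^q to sharpen
    spectral decay before extracting the range).
    """
    m, n = A.shape
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((n, k + p))   # Gaussian test matrix
    Y = A @ Omega                             # sketch the range of A
    for _ in range(q):
        Y = A @ (A.T @ Y)                     # optional power iteration
    Q, _ = np.linalg.qr(Y)                    # orthonormal basis for the sketch
    B = Q.T @ A                               # small (k+p) x n projection
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :k], s[:k], Vt[:k, :]         # truncate to rank k

# Toy comparison on a matrix with known singular values 100, 99, ..., 1.
A = np.diag(np.arange(100, 0, -1).astype(float))
U, s, Vt = rsvd(A, k=10)
err = np.linalg.norm(A - (U * s) @ Vt, ord="fro")
opt = np.sqrt(sum(i**2 for i in range(1, 91)))  # optimal rank-10 error (Eckart-Young)
```

By Eckart-Young, `err` can never fall below `opt`; the paper's contribution is pinning down, asymptotically, exactly how far above `opt` the RSVD error sits as a function of the singular value profile.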
Similar Papers
Low-Rank Matrix Approximation for Neural Network Compression
Machine Learning (CS)
Makes smart computer programs run faster and smaller.
On the randomized SVD in infinite dimensions
Numerical Analysis
Makes big math problems faster and more accurate.
Efficient randomized algorithms for the fixed Tucker-rank problem of Tucker decomposition with adaptive shifts
Numerical Analysis
Makes big data problems solve much faster.