Randomized and quantum approximate matrix multiplication
By: Simon Apers, Arjan Cornelissen, Samson Wang
Potential Business Impact:
Makes computers multiply large matrices much faster by allowing a small approximation error.
The complexity of matrix multiplication is a central topic in computer science. While the focus has traditionally been on exact algorithms, a long line of work also considers randomized algorithms, which return an approximate solution in less time. In this work, we adopt a unifying perspective that frames these randomized algorithms in terms of mean estimation. Using it, we first give refined analyses of classical algorithms based on random walks by Cohen-Lewis ('99), and based on sketching by Sarlós ('06) and Drineas-Kannan-Mahoney ('06). We then propose an improvement on Cohen-Lewis that yields a single classical algorithm faster than all the other approaches, assuming no use of (exact) fast matrix multiplication as a subroutine. Second, we demonstrate a quantum speedup on top of these algorithms by using the recent quantum multivariate mean estimation algorithm of Cornelissen-Hamoudi-Jerbi ('22).
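To give a flavor of the sampling-based (sketching) approach in the spirit of Drineas-Kannan-Mahoney, the following is a minimal NumPy sketch, not the paper's algorithm: it estimates the product A·B as a rescaled sum of a few sampled outer products, with columns drawn proportionally to their norms so the estimator is unbiased. Function and parameter names here are illustrative only.

```python
import numpy as np

def approx_matmul_sampling(A, B, s, rng=None):
    """Approximate A @ B by sampling s outer products (column of A times row of B).

    Columns are drawn with probability proportional to ||A[:, k]|| * ||B[k, :]||,
    and each sampled term is rescaled by 1/(s * p[k]) so that E[C] = A @ B.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[1]
    # Importance-sampling probabilities (fall back to uniform if all norms vanish).
    weights = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    total = weights.sum()
    p = weights / total if total > 0 else np.full(n, 1.0 / n)
    idx = rng.choice(n, size=s, p=p)
    C = np.zeros((A.shape[0], B.shape[1]))
    for k in idx:
        C += np.outer(A[:, k], B[k, :]) / (s * p[k])
    return C

# Usage: larger s gives a better approximation.
A = np.random.randn(200, 300)
B = np.random.randn(300, 150)
C_approx = approx_matmul_sampling(A, B, s=100, rng=0)
err = np.linalg.norm(C_approx - A @ B) / np.linalg.norm(A @ B)
print(f"relative Frobenius error: {err:.3f}")
```

In this mean-estimation view, each sampled (and rescaled) outer product is an unbiased estimate of the full product, and averaging s of them reduces the variance; the quantum speedup discussed in the paper comes from replacing this classical averaging step with quantum multivariate mean estimation.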
Similar Papers
Quantum Worst-Case to Average-Case Reduction for Matrix-Vector Multiplication
Quantum Physics
Makes quantum computers solve harder problems faster.
(Approximate) Matrix Multiplication via Convolutions
Data Structures and Algorithms
Makes computers multiply large matrices much faster by allowing a small approximation error.
A Note on Fine-Grained Quantum Reductions for Linear Algebraic Problems
Data Structures and Algorithms
Makes computers multiply large matrices much faster by allowing a small approximation error.