Smoothed analysis in compressed sensing
By: Elad Aigner-Horev, Dan Hefetz, Michael Trushkin
Potential Business Impact:
Makes computers find hidden patterns in messy data.
Arbitrary matrices $M \in \mathbb{R}^{m \times n}$, randomly perturbed in an additive manner using a random matrix $R \in \mathbb{R}^{m \times n}$, are shown to asymptotically almost surely satisfy the so-called {\sl robust null space property}. Whilst insisting on an asymptotically optimal order of magnitude for $m$ required to attain {\sl unique reconstruction} via $\ell_1$-minimisation algorithms, our results track the level of arbitrariness allowed for the fixed seed matrix $M$ as well as the degree of distributional irregularity allowed for the entries of the perturbing matrix $R$. Starting with sub-gaussian entries for $R$, our results culminate with these being allowed to have substantially heavier tails than sub-exponential ones. Throughout this trajectory, two measures control the arbitrariness allowed for $M$: the first is $\|M\|_\infty$ and the second is a localised notion of the Frobenius norm of $M$ (which depends on the sparsity of the signal being reconstructed). A key tool driving our proofs is {\sl Mendelson's small-ball method} ({\em Learning without concentration}, J. ACM, Vol. $62$, $2015$).
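The setting the abstract describes can be illustrated numerically: take an arbitrary seed matrix $M$, add a random perturbation $R$, measure a sparse signal with $A = M + R$, and recover it by $\ell_1$-minimisation (basis pursuit). The sketch below is not the paper's construction or proof; the rank-one seed matrix, the dimensions, the Gaussian perturbation, and the use of scipy.optimize.linprog to solve basis pursuit as a linear programme are all illustrative assumptions.

```python
# Minimal illustrative sketch of the smoothed compressed-sensing setup:
# an arbitrary (here deliberately degenerate, rank-one) seed matrix M is
# perturbed by a random matrix R, and a sparse signal is recovered from
# y = (M + R) x via l1-minimisation. All choices below are assumptions
# made for the example, not the paper's method.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, s = 60, 200, 5                       # measurements, ambient dimension, sparsity

M = np.ones((m, n))                        # arbitrary/degenerate seed matrix (rank one)
R = rng.normal(scale=1.0, size=(m, n))     # sub-gaussian (Gaussian) perturbation
A = M + R                                  # smoothed measurement matrix

x_true = np.zeros(n)                       # s-sparse ground-truth signal
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.normal(size=s)
y = A @ x_true

# Basis pursuit (min ||x||_1 s.t. Ax = y) as a linear programme:
# write x = u - v with u, v >= 0 and minimise sum(u) + sum(v).
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With a perturbation of this scale the recovery error is typically negligible even though the seed matrix alone would be useless for reconstruction, which is the phenomenon the paper quantifies.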
Similar Papers
Randomized strong rank-revealing QR for column subset selection and low-rank matrix approximation
Numerical Analysis
Finds important patterns in complex data.
Fast exact recovery of noisy matrix from few entries: the infinity norm approach
Statistics Theory
Rebuilds missing data even when it's a little messy.
Structured Approximation of Toeplitz Matrices and Subspaces
Information Theory
Fixes broken data using math tricks.