Generalized Orthogonal Approximate Message-Passing for Sublinear Sparsity
By: Keigo Takeuchi
Potential Business Impact:
Helps computers recover hidden sparse information from fewer measurements, faster and more reliably.
This paper addresses the reconstruction of sparse signals from generalized linear measurements. Signal sparsity is assumed to be sublinear in the signal dimension, whereas conventional research assumed sparsity proportional to the signal dimension. Approximate message-passing (AMP) has poor convergence properties for sensing matrices beyond standard Gaussian matrices. To solve this convergence issue, generalized orthogonal AMP (GOAMP) is proposed for signals with sublinear sparsity. The main feature of GOAMP is the so-called Onsager correction, which realizes asymptotic Gaussianity of the estimation errors. The Onsager correction in GOAMP is designed via state evolution for orthogonally invariant sensing matrices in the sublinear sparsity limit, where the signal sparsity and measurement dimension tend to infinity at sublinear speed in the signal dimension. When the support of the non-zero signals does not contain a neighborhood of the origin, GOAMP using Bayesian denoisers is proved to achieve error-free signal reconstruction for linear measurements if and only if the measurement dimension exceeds a threshold, which is equal to that of AMP for standard Gaussian sensing matrices. Numerical simulations are also presented for linear measurements and 1-bit compressed sensing. When ill-conditioned sensing matrices are used, GOAMP for sublinear sparsity is shown to outperform existing reconstruction algorithms, including generalized AMP for sublinear sparsity.
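To give a concrete feel for the Onsager-correction idea, here is a minimal OAMP-style sketch for the linear-measurement case, not the paper's GOAMP. All parameter choices (problem sizes, the matched-filter linear estimator, the threshold schedule) are illustrative assumptions; the correction is realized by making the soft-thresholding denoiser divergence-free, which is one standard way OAMP-type algorithms keep the estimation errors asymptotically Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative problem: y = A x + w with a K-sparse signal x. ---
# Toy sizes; the paper's asymptotic regime has K sublinear in N.
N, M, K = 400, 200, 10
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.choice([-1.0, 1.0], size=K)
A = rng.normal(size=(M, N)) / np.sqrt(M)   # i.i.d. Gaussian, ~unit-norm columns
y = A @ x + 0.01 * rng.normal(size=M)

def soft(u, t):
    """Soft-thresholding denoiser."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

# --- OAMP-style iteration (simplified sketch, not the paper's GOAMP). ---
# The Onsager correction is realized by a divergence-free denoiser:
#   eta_df(r) = (eta(r) - <eta'(r)> r) / (1 - <eta'(r)>),
# which decorrelates the denoiser output error from its input error.
xhat = np.zeros(N)
c = N / np.trace(A.T @ A)                  # scale so that tr(W A) = N
for _ in range(30):
    res = y - A @ xhat
    r = xhat + c * (A.T @ res)             # decoupled pseudo-observation
    tau = 2.0 * np.sqrt(np.mean(res**2))   # crude noise-level estimate
    alpha = np.mean(np.abs(r) > tau)       # <eta'(r)> for soft thresholding
    xhat = (soft(r, tau) - alpha * r) / (1.0 - alpha)

rel_err = np.linalg.norm(xhat - x) / np.linalg.norm(x)
print(f"relative error: {rel_err:.3f}")
```

In this easy regime (very sparse signal, well-conditioned Gaussian matrix) the iteration drives the relative error down to a few percent; the paper's contribution is making this style of correction provably work for orthogonally invariant (including ill-conditioned) matrices in the sublinear-sparsity limit.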
Similar Papers
Markov Chains Approximate Message Passing
Data Structures and Algorithms
Helps computers find hidden patterns in messy data.
Dimension-Free Bounds for Generalized First-Order Methods via Gaussian Coupling
Machine Learning (Stat)
Makes computer learning faster and more accurate.