Markov Chains Approximate Message Passing
By: Amit Rajaraman, David X. Wu
Potential Business Impact:
Helps computers find hidden patterns in noisy data.
Markov chain Monte Carlo algorithms have long been observed to achieve near-optimal performance in various Bayesian inference settings, but developing a supporting theory that makes these observations rigorous has proved challenging. In this paper, we study the classical spiked Wigner inference problem, where one aims to recover a planted Boolean spike from a noisy matrix measurement. We relate the recovery performance of Glauber dynamics on the annealed posterior to that of Approximate Message Passing (AMP), which is known to achieve Bayes-optimal performance. Our main results rely on the analysis of an auxiliary Markov chain called restricted Gaussian dynamics (RGD). Concretely, we establish the following results:
1. RGD reduces to an effective one-dimensional recursion that mirrors the evolution of the AMP iterates.
2. From a warm start, RGD rapidly converges to a fixed point in correlation space, which recovers Bayes-optimal performance when run on the posterior.
3. Conditional on widely believed mixing results for the SK model, we recover the phase transition for non-trivial inference.
Similar Papers
Generalized Orthogonal Approximate Message-Passing for Sublinear Sparsity
Information Theory
Helps computers guess hidden information faster.
Dimension-Free Bounds for Generalized First-Order Methods via Gaussian Coupling
Machine Learning (Stat)
Makes computer learning faster and more accurate.