Score: 2

Markov Chains Approximate Message Passing

Published: December 2, 2025 | arXiv ID: 2512.02384v1

By: Amit Rajaraman, David X. Wu

BigTech Affiliations: Massachusetts Institute of Technology; University of California, Berkeley

Potential Business Impact:

Explains why common sampling algorithms can recover hidden signals from noisy data as well as the best known methods.

Business Areas:
A/B Testing, Data and Analytics

Markov chain Monte Carlo algorithms have long been observed to obtain near-optimal performance in various Bayesian inference settings. However, developing a supporting theory that makes these observations rigorous has proved challenging. In this paper, we study the classical spiked Wigner inference problem, where one aims to recover a planted Boolean spike from a noisy matrix measurement. We relate the recovery performance of Glauber dynamics on the annealed posterior to the performance of Approximate Message Passing (AMP), which is known to achieve Bayes-optimal performance. Our main results rely on the analysis of an auxiliary Markov chain called restricted Gaussian dynamics (RGD). Concretely, we establish the following results:
1. RGD can be reduced to an effective one-dimensional recursion which mirrors the evolution of the AMP iterates.
2. From a warm start, RGD rapidly converges to a fixed point in correlation space, which recovers Bayes-optimal performance when run on the posterior.
3. Assuming widely believed mixing results for the SK model, we recover the phase transition for non-trivial inference.
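
For context, below is a minimal, hypothetical sketch of the AMP baseline the abstract compares against, for the spiked Wigner model with a ±1 (Boolean) spike. It is not code from the paper: the model normalization, the tanh(λ·) posterior-mean denoiser, the Onsager correction, and the spectral warm start are standard textbook choices assumed here for illustration, and names such as `spiked_wigner`, `amp`, `lam`, and `n_iter` are placeholders.

```python
import numpy as np

def spiked_wigner(x, lam, rng):
    """Sample Y = (lam/n) * x x^T + W, with W a symmetric Gaussian (GOE-like) noise matrix."""
    n = x.size
    G = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
    W = (G + G.T) / np.sqrt(2)                      # off-diagonal variance 1/n
    return (lam / n) * np.outer(x, x) + W

def amp(Y, lam, n_iter=25):
    """AMP with the posterior-mean denoiser tanh(lam * s) for a +/-1 spike, plus Onsager correction."""
    n = Y.shape[0]
    # spectral warm start: for lam > 1 the top eigenvector of Y has nontrivial overlap with the spike
    _, vecs = np.linalg.eigh(Y)
    x_t = np.sqrt(n) * vecs[:, -1]
    m_prev = np.zeros(n)
    for _ in range(n_iter):
        m_t = np.tanh(lam * x_t)                    # entrywise Bayes denoiser for the Boolean prior
        b_t = lam * np.mean(1.0 - m_t ** 2)         # Onsager term: average derivative of the denoiser
        x_t, m_prev = Y @ m_t - b_t * m_prev, m_t
    return np.sign(m_prev)

rng = np.random.default_rng(0)
n, lam = 2000, 1.5                                  # lam > 1: above the threshold for non-trivial inference
x = rng.choice([-1.0, 1.0], size=n)
x_hat = amp(spiked_wigner(x, lam, rng), lam)
print("overlap with the planted spike:", abs(x_hat @ x) / n)
```

For lam above 1, the printed overlap should settle near the nontrivial fixed point described by AMP's one-dimensional state evolution, which is the same correlation-space recursion the abstract relates to RGD; the spike is only identifiable up to a global sign, hence the absolute value.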

Country of Origin
🇺🇸 United States

Page Count
41 pages

Category
Computer Science:
Data Structures and Algorithms