Posterior Sampling by Combining Diffusion Models with Annealed Langevin Dynamics

Published: October 30, 2025 | arXiv ID: 2510.26324v2

By: Zhiyang Xun, Shivam Gupta, Eric Price

Potential Business Impact:

Lets computers reconstruct hidden images from noisy, blurry, or incomplete data.

Business Areas:
A/B Testing, Data and Analytics

Given a noisy linear measurement $y = Ax + \xi$ of a distribution $p(x)$, and a good approximation to the prior $p(x)$, when can we sample from the posterior $p(x \mid y)$? Posterior sampling provides an accurate and fair framework for tasks such as inpainting, deblurring, and MRI reconstruction, and several heuristics attempt to approximate it. Unfortunately, approximate posterior sampling is computationally intractable in general. To sidestep this hardness, we focus on (local or global) log-concave distributions $p(x)$. In this regime, Langevin dynamics yields posterior samples when the exact scores of $p(x)$ are available, but it is brittle to score estimation error, requiring an MGF bound (sub-exponential error). By contrast, in the unconditional setting, diffusion models succeed with only an $L^2$ bound on the score error. We prove that combining diffusion models with an annealed variant of Langevin dynamics achieves conditional sampling in polynomial time using merely an $L^4$ bound on the score error.
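
To make the recipe concrete, here is a minimal sketch of annealed Langevin dynamics for posterior sampling under the linear Gaussian measurement model $y = Ax + \xi$, where the posterior score decomposes as the prior score (e.g., from a diffusion model's score network) plus the closed-form likelihood score $A^\top(y - Ax)/\sigma^2$. The function name `annealed_langevin_posterior`, the noise schedule, and the step-size rule below are illustrative assumptions, not the paper's algorithm or its constants.

```python
import numpy as np

def annealed_langevin_posterior(score_prior, A, y, sigma_noise,
                                x0, noise_levels, steps_per_level=100,
                                step_scale=1e-2, rng=None):
    """Toy annealed Langevin sampler for p(x | y) with y = A x + xi,
    xi ~ N(0, sigma_noise^2 I).

    score_prior(x, t) should approximate the score of the prior smoothed
    at noise level t (e.g., a diffusion model's learned score). This is a
    sketch of the general technique, not the paper's procedure.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    for t in noise_levels:               # anneal from large noise to small
        eta = step_scale * t ** 2        # illustrative step size, shrinks with t
        for _ in range(steps_per_level):
            # posterior score = prior score + likelihood score
            grad = score_prior(x, t) + A.T @ (y - A @ x) / sigma_noise ** 2
            # Langevin update: gradient step plus Gaussian noise
            x = x + eta * grad + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
    return x
```

As a sanity check under a prior where the smoothed score is known exactly (a standard normal prior has smoothed score $-x/(1+t^2)$ at noise level $t$):

```python
d = 5
A = np.eye(d)
x_true = np.random.default_rng(0).standard_normal(d)
y = A @ x_true + 0.1 * np.random.default_rng(1).standard_normal(d)
sample = annealed_langevin_posterior(
    lambda x, t: -x / (1 + t ** 2), A, y, sigma_noise=0.1,
    x0=np.zeros(d), noise_levels=np.geomspace(1.0, 0.01, 10))
```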

Country of Origin
🇺🇸 United States

Page Count
51 pages

Category
Computer Science:
Machine Learning (CS)