Posterior Sampling by Combining Diffusion Models with Annealed Langevin Dynamics
By: Zhiyang Xun, Shivam Gupta, Eric Price
Potential Business Impact:
Lets computers reconstruct hidden images from noisy, incomplete measurements.
Given a noisy linear measurement $y = Ax + \xi$ of a sample $x$ drawn from a distribution $p(x)$, and a good approximation to the prior $p(x)$, when can we sample from the posterior $p(x \mid y)$? Posterior sampling provides an accurate and fair framework for tasks such as inpainting, deblurring, and MRI reconstruction, and several heuristics attempt to approximate it. Unfortunately, approximate posterior sampling is computationally intractable in general. To sidestep this hardness, we focus on locally or globally log-concave distributions $p(x)$. In this regime, Langevin dynamics yields posterior samples when the exact scores of $p(x)$ are available, but it is brittle to score estimation error, requiring a moment generating function (MGF) bound on the error (i.e., sub-exponential error). By contrast, in the unconditional setting, diffusion models succeed with only an $L^2$ bound on the score error. We prove that combining diffusion models with an annealed variant of Langevin dynamics achieves conditional sampling in polynomial time using merely an $L^4$ bound on the score error.
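To make the setup concrete, below is a minimal NumPy sketch of annealed Langevin dynamics for posterior sampling under a linear measurement $y = Ax + \xi$. It is illustrative only, not the paper's algorithm: the function names (`score_prior`, `posterior_score`, `annealed_langevin`), the noise schedule, and the step-size rule are assumptions, and a standard Gaussian prior stands in for a learned diffusion-model score, which would be plugged in at the same place.

```python
import numpy as np

def score_prior(x, sigma):
    # Stand-in for a learned diffusion score: for a N(0, I) prior smoothed
    # with noise level sigma, grad log p_sigma(x) = -x / (1 + sigma^2).
    return -x / (1.0 + sigma**2)

def posterior_score(x, y, A, noise_std, sigma):
    # grad_x log p(x | y) = grad log p_sigma(x) + grad_x log p(y | x),
    # where p(y | x) = N(Ax, noise_std^2 I) for the measurement y = Ax + xi.
    likelihood_grad = A.T @ (y - A @ x) / noise_std**2
    return score_prior(x, sigma) + likelihood_grad

def annealed_langevin(y, A, noise_std, sigmas, steps_per_level=100, rng=None):
    # Run Langevin dynamics on the posterior score while annealing the
    # smoothing level sigma from high to low.
    if rng is None:
        rng = np.random.default_rng(0)
    d = A.shape[1]
    x = rng.standard_normal(d)
    for sigma in sigmas:
        eta = 0.1 * sigma**2  # step size shrinks with the noise level
        for _ in range(steps_per_level):
            g = posterior_score(x, y, A, noise_std, sigma)
            x = x + eta * g + np.sqrt(2 * eta) * rng.standard_normal(d)
    return x

# Toy usage: recover a 4-dimensional x from 2 noisy linear measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 4))
x_true = rng.standard_normal(4)
y = A @ x_true + 0.1 * rng.standard_normal(2)
sigmas = np.geomspace(1.0, 0.01, 10)  # anneal from coarse to fine
sample = annealed_langevin(y, A, noise_std=0.1, sigmas=sigmas, rng=rng)
print("posterior sample:", sample)
```

The step size proportional to $\sigma^2$ follows the standard annealed Langevin schedule; the key point the abstract makes is that this annealing is what lets the procedure tolerate an $L^4$-bounded score error rather than the MGF bound plain Langevin dynamics requires.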
Similar Papers
Efficient Approximate Posterior Sampling with Annealed Langevin Monte Carlo
Machine Learning (CS)
Makes AI create realistic images from messy data.
Diffusion annealed Langevin dynamics: a theoretical study
Probability
Makes AI create realistic images and sounds.