Generative diffusion posterior sampling for informative likelihoods
By: Zheng Zhao
Potential Business Impact:
Helps AI create better pictures when the guiding data is very precise or unusual.
Sequential Monte Carlo (SMC) methods have recently shown successful results for conditional sampling of generative diffusion models. In this paper we propose a new diffusion posterior SMC sampler that achieves improved statistical efficiency, particularly under outlier conditions or highly informative likelihoods. The key idea is to construct an observation path that correlates with the diffusion model and to design the sampler to leverage this correlation for more efficient sampling. Empirical results demonstrate the improved efficiency.
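For intuition only, below is a minimal Python/NumPy sketch of a generic diffusion posterior SMC sampler that reweights reverse-diffusion particles against a time-indexed observation path and resamples when the effective sample size drops. This is not the paper's algorithm: the transition `denoise_step`, the correlated path `obs_path`, the `log_likelihood`, and the toy linear-Gaussian usage at the bottom are all assumed placeholders for illustration.

```python
import numpy as np

def smc_diffusion_posterior(y, denoise_step, obs_path, log_likelihood,
                            num_particles=256, num_steps=100, dim=2, rng=None):
    """Illustrative sketch (not the paper's method): particles follow a reverse-
    diffusion transition and are reweighted against an observation path obs_path(y, t)
    constructed to correlate with the diffusion noise level at time t."""
    rng = np.random.default_rng() if rng is None else rng
    # Initialise particles from the diffusion prior (standard Gaussian at t = 1).
    x = rng.standard_normal((num_particles, dim))
    logw = np.zeros(num_particles)
    w = np.full(num_particles, 1.0 / num_particles)

    for k in range(num_steps, 0, -1):
        t = k / num_steps
        # Move particles one reverse-diffusion step (placeholder transition).
        x = denoise_step(x, t, rng)
        # Reweight against the correlated observation at this noise level.
        y_t = obs_path(y, t)
        logw += log_likelihood(y_t, x, t)
        # Normalise weights and resample when the effective sample size is low.
        logw -= logw.max()
        w = np.exp(logw)
        w /= w.sum()
        ess = 1.0 / np.sum(w ** 2)
        if ess < num_particles / 2:
            idx = rng.choice(num_particles, size=num_particles, p=w)
            x = x[idx]
            logw = np.zeros(num_particles)
            w = np.full(num_particles, 1.0 / num_particles)
    return x, w

# Toy usage with a linear-Gaussian observation model (illustrative only).
if __name__ == "__main__":
    y = np.array([1.0, -0.5])
    denoise = lambda x, t, r: x + 0.1 * (0.0 - x) + np.sqrt(0.1) * r.standard_normal(x.shape)
    path = lambda y, t: (1.0 - t) * y  # observation path interpolating towards y as t -> 0
    loglik = lambda y_t, x, t: -0.5 * np.sum((y_t - x) ** 2, axis=-1) / (0.5 + t)
    samples, weights = smc_diffusion_posterior(y, denoise, path, loglik, dim=2)
    print("weighted posterior mean:", weights @ samples)
```

The sketch only conveys the general structure (propagate, reweight against a time-indexed observation, resample); the paper's contribution lies in how the observation path is constructed to correlate with the diffusion model, which is not reproduced here.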
Similar Papers
Reverse Diffusion Sequential Monte Carlo Samplers
Computation
Fixes computer art errors for better pictures.
Sequential Monte Carlo with Gaussian Mixture Approximation for Infinite-Dimensional Statistical Inverse Problems
Numerical Analysis
Finds hidden patterns in complex data faster.
Split Gibbs Discrete Diffusion Posterior Sampling
Machine Learning (CS)
Creates new DNA, images, and music from scratch.