One-shot Conditional Sampling: MMD meets Nearest Neighbors
By: Anirban Chatterjee, Sayantan Choudhury, Rohan Hore
Potential Business Impact:
Makes computers create better pictures from less info.
How can we generate samples from a conditional distribution that we never fully observe? This question arises across a broad range of applications in both modern machine learning and classical statistics, including image post-processing in computer vision, approximate posterior sampling in simulation-based inference, and conditional distribution modeling in complex data settings. Compared with unconditional sampling, additional feature information can be leveraged in these settings to enable more adaptive and efficient sampling. Building on this idea, we introduce Conditional Generator using MMD (CGMMD), a novel framework for conditional sampling. Unlike many contemporary approaches, our method frames the training objective as a simple, adversary-free direct minimization problem. A key feature of CGMMD is its ability to produce conditional samples in a single forward pass of the generator, enabling practical one-shot sampling with low test-time complexity. We establish rigorous theoretical bounds on the loss incurred when sampling from the CGMMD sampler, and prove convergence of the estimated distribution to the true conditional distribution. In the process, we also develop a uniform concentration result for nearest-neighbor-based functionals, which may be of independent interest. Finally, we show that CGMMD performs competitively on synthetic tasks involving complex conditional densities, as well as on practical applications such as image denoising and image super-resolution.
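To make the adversary-free objective concrete, here is a minimal NumPy sketch of the idea described in the abstract: a one-shot generator maps (feature, noise) pairs to samples, and candidate generators are compared by the empirical MMD between their outputs and observed data. This is an illustrative toy, not the paper's CGMMD implementation; the Gaussian kernel, the `generator` map, and its `shift` parameter are all assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)), computed pairwise
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased empirical squared MMD between two samples x and y
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

def generator(features, noise, shift=0.0):
    # Hypothetical one-shot generator: a single deterministic forward
    # map of (feature, noise) to a sample; `shift` is a toy parameter.
    return features + noise + shift

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))                    # conditioning features
z = rng.normal(size=(200, 1))                    # generator noise
target = x + rng.normal(size=(200, 1))           # draws from the true conditional law

# Training would directly minimize this MMD loss over generator parameters;
# here we just compare a well-matched generator (shift=0) to a poor one.
losses = {s: mmd2(generator(x, z, s), target) for s in (0.0, 2.0)}
print(losses[0.0] < losses[2.0])
```

At test time, sampling costs one forward pass of `generator` per draw, which is the "one-shot" property the abstract highlights; the direct MMD loss avoids the adversarial min-max training used by GAN-style conditional samplers.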
Similar Papers
Adaptive generative moment matching networks for improved learning of dependence structures
Machine Learning (Stat)
Makes computer-made numbers more realistic.
Imbalance-Robust and Sampling-Efficient Continuous Conditional GANs via Adaptive Vicinity and Auxiliary Regularization
Machine Learning (CS)
Makes AI create realistic images faster.
Multi-fidelity Parameter Estimation Using Conditional Diffusion Models
Machine Learning (CS)
Makes computer models guess better and faster.