Debiasing Kernel-Based Generative Models
By: Tian Qin, Wei-Min Huang
Potential Business Impact:
Makes computer pictures look clearer and sharper.
We propose a novel two-stage framework of generative models named Debiasing Kernel-Based Generative Models (DKGM), drawing on insights from kernel density estimation (KDE) and stochastic approximation. In the first stage of DKGM, we employ KDE to sidestep the difficulties of estimating the data density without sacrificing much image quality. A known characteristic of KDE is oversmoothing, which makes the generated images blurry. Therefore, in the second stage, we formulate the reduction of image blurriness as a statistical debiasing problem and develop a novel iterative algorithm, inspired by stochastic approximation, to improve image quality. Extensive experiments illustrate that the image quality of DKGM on CIFAR10 is comparable to that of state-of-the-art models such as diffusion models and GANs. The performance of DKGM on CelebA 128x128 and LSUN (Church) 128x128 is also competitive. We conduct additional experiments to investigate how the KDE bandwidth affects the sample diversity and debiasing effect of DKGM. The connections between DKGM and score-based models are also discussed.
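The abstract does not spell out the algorithm, so the following is only a rough 1-D numerical sketch of the two ideas it names: sampling from a Gaussian KDE oversmooths (the sampled variance is inflated by roughly the squared bandwidth), and a Robbins-Monro-style stochastic-approximation loop can iteratively correct such a bias. Everything in the sketch (the 1-D Gaussian data, the kde_sample helper, the rescaling target, and the step-size schedule) is an illustrative assumption, not the paper's DKGM algorithm.

# A minimal, self-contained 1-D illustration (NOT the paper's DKGM algorithm)
# of the two ingredients the abstract combines:
#   (1) sampling from a Gaussian KDE oversmooths: the KDE is the empirical
#       distribution convolved with the kernel, so the variance is inflated
#       by roughly h^2;
#   (2) a Robbins-Monro-style stochastic-approximation recursion can remove
#       such a bias by repeatedly nudging a parameter with noisy feedback.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=5000)  # "training set", true std = 1
h = 0.5                                           # KDE bandwidth

def kde_sample(n):
    """Stage-one analogue: draw from the KDE by picking a data point
    at random and adding Gaussian kernel noise of scale h."""
    centers = rng.choice(data, size=n)
    return centers + h * rng.normal(size=n)

print("KDE-sample std:", kde_sample(50_000).std())  # ~ sqrt(1 + h^2) ~= 1.12

# Stage-two analogue: treat the oversmoothing as a bias and correct it with a
# stochastic-approximation recursion. Here we learn a single rescaling factor
# s so that the variance of s * (KDE sample) matches the data variance, using
# noisy mini-batch estimates and Robbins-Monro step sizes a_k = c / k.
target_var = data.var()
s = 1.0
for k in range(1, 2001):
    a_k = 0.5 / k
    batch = kde_sample(64)
    noisy_residual = np.mean((s * batch) ** 2) - target_var  # noisy bias signal
    s -= a_k * noisy_residual

print("learned scale s:", s)                              # ~ 1/sqrt(1 + h^2) ~= 0.89
print("debiased std:", (s * kde_sample(50_000)).std())    # ~ 1.0

The rescaling of a single scale parameter is only a toy stand-in for the paper's image-space debiasing stage; it is meant to show the mechanics of a stochastic-approximation correction, and how the bandwidth h controls the size of the bias being removed.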
Similar Papers
SD-KDE: Score-Debiased Kernel Density Estimation
Machine Learning (CS)
Improves how computers guess data patterns.
Correcting Mode Proportion Bias in Generalized Bayesian Inference via a Weighted Kernel Stein Discrepancy
Machine Learning (CS)
Helps computers find all answers, even tricky ones.
CKGAN: Training Generative Adversarial Networks Using Characteristic Kernel Integral Probability Metrics
Machine Learning (CS)
Makes fake pictures look more real.