Information Theoretic Learning for Diffusion Models with Warm Start
By: Yirong Shen, Lu Gan, Cong Ling
Potential Business Impact:
Makes AI learn faster and better from messy data.
Generative models that maximize model likelihood have gained traction in many practical settings. Among them, perturbation-based approaches underpin many strong likelihood-estimation models, yet they often suffer from slow convergence and limited theoretical understanding. In this paper, we derive a tighter likelihood bound for noise-driven models to improve both the accuracy and efficiency of maximum likelihood learning. Our key insight extends the classical relationship between KL divergence and Fisher information to arbitrary noise perturbations, moving beyond the Gaussian assumption and enabling structured noise distributions. This formulation allows flexible use of randomized noise distributions that naturally account for sensor artifacts, quantization effects, and data-distribution smoothing, while remaining compatible with standard diffusion training. Treating the diffusion process as a Gaussian channel, we further express the mismatched entropy between the data and the model, showing that the proposed objective upper-bounds the negative log-likelihood (NLL). In experiments, our models achieve competitive NLL on CIFAR-10 and state-of-the-art results on ImageNet across multiple resolutions, all without data augmentation, and the framework extends naturally to discrete data.
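For context, the "classical KL divergence / Fisher information relationship" that the abstract says it generalizes is, in the Gaussian case, a de Bruijn-type identity. A minimal sketch follows, assuming the data density p_t and the model density q_t are both smoothed by the same Gaussian noise of variance t (the notation here is ours, not the paper's):

% Sketch of the classical Gaussian-case identity (assumed notation, not the paper's).
% p_t, q_t: data and model densities after adding N(0, t I) noise,
% so both evolve under the heat equation in t.
\begin{align}
  \frac{\mathrm{d}}{\mathrm{d}t}\, D_{\mathrm{KL}}\!\left(p_t \,\|\, q_t\right)
    &= -\tfrac{1}{2}\, J\!\left(p_t \,\|\, q_t\right), \\
  J\!\left(p \,\|\, q\right)
    &= \int p(x)\,\bigl\|\nabla_x \log p(x) - \nabla_x \log q(x)\bigr\|^{2}\, \mathrm{d}x .
\end{align}

Integrating this identity across noise levels turns a score-matching-style objective into an upper bound on the NLL; the paper's stated contribution is extending such a relationship beyond Gaussian perturbations to arbitrary, possibly structured, noise distributions.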
Similar Papers
The Information Dynamics of Generative Diffusion
Machine Learning (Stat)
Makes AI create better pictures by controlling noise.
The Information Dynamics of Generative Diffusion
Machine Learning (Stat)
Makes AI create new things by breaking symmetries.