Score: 1

Information Theoretic Learning for Diffusion Models with Warm Start

Published: October 23, 2025 | arXiv ID: 2510.20903v1

By: Yirong Shen, Lu Gan, Cong Ling

Potential Business Impact:

Makes AI learn faster and better from messy data.

Business Areas:
Intrusion Detection, Information Technology, Privacy and Security

Generative models that maximize model likelihood have gained traction in many practical settings. Among them, perturbation-based approaches underpin many strong likelihood estimation models, yet they often face slow convergence and limited theoretical understanding. In this paper, we derive a tighter likelihood bound for noise-driven models to improve both the accuracy and efficiency of maximum likelihood learning. Our key insight extends the classical relationship between KL divergence and Fisher information to arbitrary noise perturbations, going beyond the Gaussian assumption and enabling structured noise distributions. This formulation allows flexible use of randomized noise distributions that naturally account for sensor artifacts, quantization effects, and data distribution smoothing, while remaining compatible with standard diffusion training. Treating the diffusion process as a Gaussian channel, we further express the mismatched entropy between data and model, showing that the proposed objective upper bounds the negative log-likelihood (NLL). In experiments, our models achieve competitive NLL on CIFAR-10 and state-of-the-art results on ImageNet across multiple resolutions, all without data augmentation, and the framework extends naturally to discrete data.
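For context, the classical Gaussian case that the abstract says this work generalizes is the de Bruijn-type identity linking KL divergence and Fisher information along the heat flow. A minimal sketch of that standard background (not the paper's new bound), with p_t and q_t denoting the data and model distributions after adding N(0, tI) Gaussian noise:

% Relative-entropy version of de Bruijn's identity along the Gaussian channel:
% the KL divergence between noised data and noised model decreases at a rate
% given by the relative Fisher information (the expected squared score mismatch).
\[
\frac{d}{dt}\, D_{\mathrm{KL}}\!\left(p_t \,\|\, q_t\right)
  = -\tfrac{1}{2} \int p_t(x)\,\bigl\|\nabla_x \log p_t(x) - \nabla_x \log q_t(x)\bigr\|^2 \, dx
  = -\tfrac{1}{2}\, J\!\left(p_t \,\|\, q_t\right).
\]
% Integrating over t in [0, T] yields the likelihood decomposition that underlies
% score-matching training of Gaussian diffusion models:
\[
D_{\mathrm{KL}}\!\left(p_0 \,\|\, q_0\right)
  = D_{\mathrm{KL}}\!\left(p_T \,\|\, q_T\right)
  + \tfrac{1}{2} \int_0^T J\!\left(p_t \,\|\, q_t\right) dt.
\]

Per the abstract, the paper's contribution is to extend this Gaussian-only relationship to arbitrary, structured noise perturbations and to show that the resulting training objective still upper bounds the negative log-likelihood.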

Country of Origin
🇬🇧 United Kingdom

Page Count
52 pages

Category
Computer Science:
Information Theory