MMG: Mutual Information Estimation via the MMSE Gap in Diffusion
By: Longxuan Yu, Xing Shi, Xianghao Kong, and more
Potential Business Impact:
Helps computers find hidden connections in data.
Mutual information (MI) is one of the most general ways to measure relationships between random variables, but estimating this quantity for complex systems is challenging. Denoising diffusion models have recently set a new bar for density estimation, so it is natural to ask whether these methods could also improve MI estimation. Using the recently introduced information-theoretic formulation of denoising diffusion models, we show that diffusion models can be used in a straightforward way to estimate MI. In particular, the MI corresponds to half the gap in the Minimum Mean Square Error (MMSE) between conditional and unconditional diffusion, integrated over all signal-to-noise ratios (SNRs) in the noising process. Our approach not only passes self-consistency tests but also outperforms traditional and score-based diffusion MI estimators. Furthermore, our method leverages adaptive importance sampling to achieve scalable MI estimation while maintaining strong performance even when the MI is high.
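The identity in the abstract can be illustrated on a toy case where everything is available in closed form. The sketch below (my own illustration, not the paper's implementation) takes jointly Gaussian scalars X and Y with correlation `rho`, for which both the conditional and unconditional MMSE of denoising sqrt(snr)*X + Z are known analytically, numerically integrates half the MMSE gap over SNR, and checks it against the exact Gaussian MI, -0.5*log(1 - rho^2). The function names and the log-spaced quadrature grid are my choices for the demo.

```python
import numpy as np

def mmse_uncond(snr):
    # MMSE of estimating X ~ N(0, 1) from sqrt(snr) * X + Z, Z ~ N(0, 1)
    return 1.0 / (1.0 + snr)

def mmse_cond(snr, rho):
    # Same channel, but conditioned on Y: for jointly Gaussian (X, Y)
    # with correlation rho, Var(X | Y) = 1 - rho^2
    v = 1.0 - rho ** 2
    return v / (1.0 + snr * v)

def mi_via_mmse_gap(rho, snr_lo=1e-8, snr_hi=1e8, n=200_000):
    # I(X; Y) = 1/2 * integral over snr of [mmse_uncond - mmse_cond]
    # (the MMSE-gap form of the I-MMSE relation); trapezoid rule on a
    # log-spaced SNR grid, since the integrand decays like 1/snr^2
    snr = np.logspace(np.log10(snr_lo), np.log10(snr_hi), n)
    gap = mmse_uncond(snr) - mmse_cond(snr, rho)
    return 0.5 * np.sum((gap[1:] + gap[:-1]) * np.diff(snr)) / 2.0

rho = 0.8
est = mi_via_mmse_gap(rho)
exact = -0.5 * np.log(1.0 - rho ** 2)  # closed-form Gaussian MI
print(f"MMSE-gap estimate: {est:.6f}, exact: {exact:.6f}")
```

In the paper's setting the two MMSE curves are not available in closed form; they are estimated by the denoising errors of a conditional and an unconditional diffusion model, and adaptive importance sampling replaces the fixed quadrature grid.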
Similar Papers
Accurate Estimation of Mutual Information in High Dimensional Data
Data Analysis, Statistics and Probability
Makes computers understand data better, even messy data.
Mutual Information Estimation via Score-to-Fisher Bridge for Nonlinear Gaussian Noise Channels
Information Theory
Helps computers understand messy signals better.
Information-Theoretic Discrete Diffusion
Machine Learning (CS)
Improves AI's ability to guess missing words.