A Quantitative Entropy Power Inequality for Dependent Random Vectors
By: Mokshay Madiman, James Melbourne, Cyril Roberto
The entropy power inequality for independent random vectors is a foundational result of information theory, with deep connections to probability and geometric functional analysis. Several extensions of the entropy power inequality to dependent settings have been developed, including results of Takano, Johnson, and Rioul. We extend these works by developing a quantitative version of the entropy power inequality for dependent random vectors. A notable consequence is that an entropy power inequality stated using conditional entropies holds for random vectors whose joint density is log-supermodular.
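For reference, a sketch of the classical (independent) statement that the abstract builds on; the dependent, log-supermodular version proved in the paper is not reproduced here, and the conditional form below is only schematic. For independent random vectors X and Y in R^n with densities, the Shannon-Stam entropy power inequality reads

  N(X + Y) \ge N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2 h(X)/n},

where h denotes differential entropy. The conditional-entropy versions referred to in the abstract replace h(\cdot) by a conditional entropy h(\cdot \mid W) for a suitable conditioning variable W, under the dependence hypotheses made precise in the paper.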