A Quantitative Entropy Power Inequality for Dependent Random Vectors

Published: December 22, 2025 | arXiv ID: 2512.19002v1

By: Mokshay Madiman, James Melbourne, Cyril Roberto

The entropy power inequality for independent random vectors is a foundational result of information theory, with deep connections to probability and geometric functional analysis. Several extensions of the entropy power inequality have been developed for settings with dependence, including those of Takano, Johnson, and Rioul. We extend these works by developing a quantitative version of the entropy power inequality for dependent random vectors. A notable consequence is that an entropy power inequality stated using conditional entropies holds for random vectors whose joint density is log-supermodular.
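For orientation, here is a minimal LaTeX sketch of the classical inequality alongside one plausible conditional form. The classical statement and the definition of log-supermodularity are standard; the conditional variant shown is an assumed reading of the abstract's "entropy power inequality stated using conditional entropies," not the paper's exact theorem, whose hypotheses and constants are not given here.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Entropy power of a random vector X with density on R^n,
% where h(X) denotes differential entropy:
%   N(X) = (2 pi e)^{-1} exp(2 h(X) / n).
Classical entropy power inequality (Shannon--Stam), for
\emph{independent} $X, Y$ with densities on $\mathbb{R}^n$:
\[
  N(X+Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n}.
\]
% Assumed conditional variant (illustrative only): replace the
% marginal entropies by conditional entropies when X and Y may
% be dependent, e.g. with log-supermodular joint density.
Conditional variant (assumed form, not quoted from the paper):
\[
  e^{2h(X+Y)/n} \;\ge\; e^{2h(X \mid Y)/n} + e^{2h(Y \mid X)/n}.
\]
% Log-supermodularity of a joint density f on R^n x R^n:
%   f(x \vee y) f(x \wedge y) >= f(x) f(y),
% with coordinatewise max/min; known as MTP2 in statistics.
\end{document}

For independent $X, Y$ the conditional entropies reduce to the marginal ones, so the assumed conditional form recovers the classical inequality as a special case.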

Category: Computer Science (Information Theory)