Information Inequalities for Five Random Variables
By: E. P. Csirmaz, L. Csirmaz
The entropic region is formed by the collection of the Shannon entropies of all subvectors of finitely many jointly distributed discrete random variables. For four or more variables the structure of the entropic region is largely unknown. We use a variant of the Maximum Entropy Method to delimit the five-variable entropic region; the method adjoins copies of some of the random variables in successive generations. A significant reduction in computational complexity, achieved through theoretical considerations and by exploiting the inherent symmetries, allowed us to compute all five-variable non-Shannon inequalities produced by the first nine generations. Based on these results, we define two infinite collections of such inequalities and prove that they are valid entropy inequalities. We investigate the downward-closed sets of non-negative lattice points that parameterize these collections, and use them to develop an algorithm that enumerates all extremal inequalities. The discovered set of entropy inequalities is conjectured to characterize the applied method completely.
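To make the central object concrete, the sketch below is an illustration only (not taken from the paper; the function name entropic_vector and the toy XOR distribution are our own choices, assuming NumPy is available). It computes the point of the entropic region determined by a small joint distribution, namely the Shannon entropy H(X_S) of every non-empty subvector X_S, which gives a vector in R^(2^n - 1).

# Illustrative sketch (not from the paper): the entropic vector of a joint
# distribution lists the Shannon entropy H(X_S) of every non-empty subvector.
# The closure of all such vectors, over all distributions, is the entropic region.
import itertools
import numpy as np

def entropic_vector(p):
    """p: joint probability table of shape (k_1, ..., k_n), entries summing to 1."""
    n = p.ndim
    vec = {}
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            # marginalize onto the variables in S, then take the Shannon entropy in bits
            axes = tuple(i for i in range(n) if i not in S)
            q = p.sum(axis=axes).ravel()
            q = q[q > 0]
            vec[S] = float(-(q * np.log2(q)).sum())
    return vec

# Toy example: two fair bits and their XOR -- any two of the three determine the third.
p = np.zeros((2, 2, 2))
for a, b in itertools.product(range(2), repeat=2):
    p[a, b, a ^ b] = 0.25
print(entropic_vector(p))  # singletons have entropy 1; every pair and the triple have entropy 2

For five variables these vectors have 2^5 - 1 = 31 coordinates, and the inequalities discussed in the abstract are linear constraints on that 31-dimensional region.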