Strengthening Han's Fourier Entropy-Influence Inequality via an Information-Theoretic Proof
By: Peijie Li, Guangyue Han
Potential Business Impact:
Makes math rules work for more kinds of problems.
We strengthen Han's Fourier entropy-influence inequality $$ H[\widehat{f}] \leq C_{1}I(f) + C_{2}\sum_{i\in [n]}I_{i}(f)\ln\frac{1}{I_{i}(f)}, $$ which was originally proved for $\{-1,1\}$-valued Boolean functions with $C_{1}=3+2\ln 2$ and $C_{2}=1$. By a short information-theoretic proof, we show that it in fact holds with the sharp constants $C_{1}=C_{2}=1$ for all real-valued Boolean functions of unit $L^{2}$-norm, thereby establishing the inequality as an elementary structural property of Shannon entropy and influence.
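A minimal worked example, assuming the standard FEI-literature definitions $H[\widehat{f}] = \sum_{S\subseteq[n]}\widehat{f}(S)^{2}\ln\frac{1}{\widehat{f}(S)^{2}}$, $I_{i}(f) = \sum_{S\ni i}\widehat{f}(S)^{2}$, and $I(f) = \sum_{i\in[n]}I_{i}(f)$ (these conventions are assumed here, not quoted from the paper): take the real-valued Boolean function $f(x) = (x_{1}+x_{2})/\sqrt{2}$ on $\{-1,1\}^{n}$, which has unit $L^{2}$-norm. Its only nonzero Fourier weights are $\widehat{f}(\{1\})^{2} = \widehat{f}(\{2\})^{2} = \tfrac{1}{2}$, so $$ H[\widehat{f}] = \ln 2,\qquad I_{1}(f) = I_{2}(f) = \tfrac{1}{2},\qquad I(f) = 1, $$ and the inequality with $C_{1}=C_{2}=1$ reads $\ln 2 \leq 1 + 2\cdot\tfrac{1}{2}\ln 2 = 1 + \ln 2$, which indeed holds.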
Similar Papers
Structural Properties of Entropic Vectors and Stability of the Ingleton Inequality
Information Theory
Makes information sharing more secure and reliable.
Inequalities Revisited
Information Theory
Finds new math rules by looking at old ones.
Inequalities in Fourier analysis on binary cubes
Classical Analysis and ODEs
Finds math rules for digital information.