Inequalities in Fourier analysis on binary cubes
By: Tonći Crmarić, Vjekoslav Kovač, Shobu Shiraki
Potential Business Impact:
Finds exact math rules for analyzing digital (binary) information.
This paper studies two classical inequalities, namely the Hausdorff-Young inequality and the equal-exponent Young convolution inequality, for discrete functions supported in the binary cube $\{0,1\}^d\subset\mathbb{Z}^d$. We characterize the exact ranges of Lebesgue exponents in which sharp versions of these two inequalities hold, and present several immediate consequences. First, if the functions are specialized to be the indicator function of a set $A\subseteq\{0,1\}^d$, then we obtain sharp upper bounds on two types of generalized additive energies of $A$, extending the works of Kane-Tao, de Dios Pont-Greenfeld-Ivanisvili-Madrid, and one of the present authors. Second, we obtain a sharp binary variant of the Beckner-Hirschman entropic uncertainty principle, as well as a sharp lower bound on the entropy of a sum of two independent random variables with values in $\{0,1\}^d$. Finally, the sharp binary Hausdorff-Young inequality also reveals the exact range of dimension-free estimates for the Fourier restriction to the binary cube.
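For a concrete handle on the first consequence, the minimal sketch below computes the classical additive energy $E(A)=\#\{(a,b,c,d)\in A^4 : a+b=c+d\}$ (sums taken coordinate-wise in $\mathbb{Z}^d$) of a subset of the binary cube and compares it with the bound $|A|^{\log_2 6}$ associated with the Kane-Tao result cited above. The abstract does not define its generalized additive energies or state any exponent, so the quantity and the exponent $\log_2 6$ here are baseline assumptions, and the function name is illustrative rather than taken from the paper.

```python
from collections import Counter
import math

def additive_energy(A):
    # E(A) = #{(a, b, c, d) in A^4 : a + b = c + d}, with coordinate-wise sums in Z^d.
    # Count representation numbers r(x) = #{(a, b) in A^2 : a + b = x};
    # then E(A) = sum_x r(x)^2.
    sums = Counter(tuple(ai + bi for ai, bi in zip(a, b)) for a in A for b in A)
    return sum(r * r for r in sums.values())

# Example: a 2-dimensional subcube inside {0,1}^3, a natural extremal configuration.
A = [(x, y, 0) for x in (0, 1) for y in (0, 1)]
print(additive_energy(A))       # 36
print(len(A) ** math.log2(6))   # 4^{log2 6} ~ 36, so the assumed bound is attained here
```

Summing the squared representation numbers avoids the quartic loop over $A^4$, so the brute-force check remains feasible for moderately large subsets of $\{0,1\}^d$.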
Similar Papers
Entropic versions of Bergström's and Bonnesen's inequalities
Information Theory
Makes math ideas about information more accurate.
Strengthening Han's Fourier Entropy-Influence Inequality via an Information-Theoretic Proof
Information Theory
Makes math rules work for more kinds of problems.
Inequalities Revisited
Information Theory
Finds new math rules by looking at old ones.