Progress on the Courtade-Kumar Conjecture: Optimal High-Noise Entropy Bounds and Generalized Coordinate-wise Mutual Information
By: Adel Javanmard, David P. Woodruff
The Courtade-Kumar conjecture posits that dictatorship functions maximize the mutual information between a Boolean function's output and a noisy version of its input over the Boolean hypercube. We present two advances related to this conjecture. First, we resolve an open question posed by Courtade and Kumar, proving that for any Boolean function (regardless of bias), the sum of the mutual information between the function's output and the individual noisy input coordinates is bounded by $1 - H(\alpha)$, where $H$ is the binary entropy function and $\alpha$ is the crossover probability of the binary symmetric channel. This generalizes their earlier result, which was restricted to balanced Boolean functions. Second, we advance the study of the main conjecture in the high-noise regime. We establish an optimal error bound of $O(\lambda^2)$ for the asymptotic entropy expansion, where $\lambda = (1-2\alpha)^2$, improving upon the previous best-known bounds. This refined analysis yields a sharp, linear Fourier concentration bound for highly informative functions and significantly extends the range of the noise parameter $\lambda$ for which the conjecture is proven to hold.
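As a quick numerical sanity check (not from the paper), the sketch below computes $I(f(X); Y_1)$ for the dictator function $f(x) = x_1$ over a BSC with crossover probability $\alpha$, and confirms it equals the conjectured maximum $1 - H(\alpha)$. The function names are illustrative, not taken from the authors' work.

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def dictator_mutual_information(alpha):
    """I(f(X); Y_1) for the dictator f(x) = x_1 over BSC(alpha).

    X_1 is uniform on {0, 1}; Y_1 flips X_1 with probability alpha.
    Analytically, I(X_1; Y_1) = H(Y_1) - H(Y_1 | X_1) = 1 - H(alpha);
    here it is computed directly from the joint distribution.
    """
    # Joint distribution of (X_1, Y_1): P(x) = 1/2, P(y | x) = BSC(alpha).
    joint = 0.5 * np.array([[1 - alpha, alpha],
                            [alpha, 1 - alpha]])
    px = joint.sum(axis=1)  # marginal of X_1
    py = joint.sum(axis=0)  # marginal of Y_1
    mi = 0.0
    for i in range(2):
        for j in range(2):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
    return mi

alpha = 0.1
print(dictator_mutual_information(alpha))  # ~0.531 bits
print(1 - binary_entropy(alpha))           # matches: 1 - H(0.1)
```

Since the dictator's output depends on a single coordinate, its entire mutual information with the noisy input is carried by that one coordinate, saturating the coordinate-wise bound $\sum_i I(f(X); Y_i) \le 1 - H(\alpha)$ stated above.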