Testing (Conditional) Mutual Information
By: Jan Seyfried, Sayantan Sen, Marco Tomamichel
Potential Business Impact:
Determines, from as few data samples as possible, whether two quantities are genuinely linked once a third factor is accounted for.
We investigate the sample complexity of mutual information and conditional mutual information testing. For conditional mutual information testing, given access to independent samples of a triple of random variables $(A, B, C)$ with unknown distribution, we want to distinguish between two cases: (i) $A$ and $C$ are conditionally independent, i.e., $I(A\!:\!C|B) = 0$, and (ii) $A$ and $C$ are conditionally dependent, i.e., $I(A\!:\!C|B) \geq \varepsilon$ for some threshold $\varepsilon$. We establish an upper bound on the number of samples required to distinguish between the two cases with high confidence, as a function of $\varepsilon$ and the three alphabet sizes. We conjecture that our bound is tight and show that this is indeed the case in several parameter regimes. For the special case of mutual information testing (when $B$ is trivial), we establish the necessary and sufficient number of samples required up to polylogarithmic terms. Our technical contributions include a novel method to efficiently simulate weakly correlated samples from the conditionally independent distribution $P_{A|B} P_{C|B} P_B$ given access to samples from an unknown distribution $P_{ABC}$, and a new estimator for equivalence testing that can handle such correlated samples, which might be of independent interest.
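To make the testing problem concrete, here is a minimal sketch (not the paper's algorithm) of the naive baseline: a plug-in estimate of $I(A\!:\!C|B)$ computed from empirical frequencies, thresholded at $\varepsilon/2$ to decide between the two cases. The helper names `plugin_cmi` and `cmi_test` and the $\varepsilon/2$ threshold are illustrative assumptions; the paper's actual approach instead relies on simulating weakly correlated samples from $P_{A|B} P_{C|B} P_B$ and a new equivalence-testing estimator, which is more sample-efficient than this plug-in baseline.

```python
# Hedged sketch: naive plug-in conditional mutual information tester.
# This illustrates the problem statement only; it is NOT the estimator
# proposed in the paper, and its sample complexity is suboptimal.
from collections import Counter
from math import log

def plugin_cmi(samples):
    """Plug-in estimate of I(A:C|B) from a list of (a, b, c) triples."""
    n = len(samples)
    n_abc = Counter((a, b, c) for a, b, c in samples)
    n_ab = Counter((a, b) for a, b, _ in samples)
    n_bc = Counter((b, c) for _, b, c in samples)
    n_b = Counter(b for _, b, _ in samples)
    cmi = 0.0
    for (a, b, c), count in n_abc.items():
        # Empirical P(a,b,c) * log[ P(a,c|b) / (P(a|b) P(c|b)) ]
        #   = P(a,b,c) * log[ P(a,b,c) P(b) / (P(a,b) P(b,c)) ];
        # with counts, the normalizations by n cancel inside the log.
        cmi += (count / n) * log(count * n_b[b] / (n_ab[(a, b)] * n_bc[(b, c)]))
    return cmi

def cmi_test(samples, eps):
    """Declare 'conditionally independent' if the estimate falls below eps/2."""
    return plugin_cmi(samples) < eps / 2
```

The interesting question, which the paper addresses, is how many samples such a tester fundamentally needs as a function of $\varepsilon$ and the alphabet sizes of $A$, $B$, and $C$.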
Similar Papers
The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints
Information Theory
Makes computers learn faster with less data.
The problem of infinite information flow
Dynamical Systems
Shows how much one thing affects another.
Non-iid hypothesis testing: from classical to quantum
Quantum Physics
Helps computers tell apart similar things better.