A Finite-Sample Strong Converse for Binary Hypothesis Testing via (Reverse) Rényi Divergence
By: Roberto Bruno, Adrien Vandenbroucque, Amedeo Roberto Esposito
This work investigates binary hypothesis testing between $H_0\sim P_0$ and $H_1\sim P_1$ in the finite-sample regime under asymmetric error constraints. By employing the ``reverse'' Rényi divergence, we derive novel non-asymptotic bounds on the Type II error probability, which naturally establish a strong converse result. Furthermore, when the Type I error is constrained to decay exponentially at rate $c$, we show that the Type II error converges to 1 exponentially fast if $c$ exceeds the Kullback-Leibler divergence $D(P_1\|P_0)$, and vanishes exponentially fast if $c$ is smaller. Finally, we present numerical examples demonstrating that the proposed converse bounds strictly improve upon existing finite-sample results in the literature.
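The dichotomy at $c = D(P_1\|P_0)$, where $D(P_1\|P_0)=\sum_x P_1(x)\log\frac{P_1(x)}{P_0(x)}$ is the Kullback-Leibler divergence, can be illustrated numerically. The sketch below is not the paper's bound; it merely evaluates the optimal Neyman-Pearson threshold test for Bernoulli sources (the parameters $p_0=0.5$, $p_1=0.8$ are assumed toy values chosen for illustration), forces the Type I error below $e^{-cn}$, and reports the resulting Type II error for rates $c$ on either side of $D(P_1\|P_0)$.

```python
# Minimal numerical sketch (not the paper's bound): the strong-converse
# dichotomy for i.i.d. Bernoulli sources. H0 ~ Bern(p0), H1 ~ Bern(p1);
# the sufficient statistic is the count K = sum of the n samples, and the
# likelihood ratio is monotone in K, so the optimal test thresholds K.
import numpy as np
from scipy.stats import binom

p0, p1 = 0.5, 0.8  # assumed toy parameters, not from the paper
# KL divergence D(P1 || P0) between the two Bernoulli laws (in nats)
D = p1 * np.log(p1 / p0) + (1 - p1) * np.log((1 - p1) / (1 - p0))
print(f"D(P1||P0) = {D:.4f} nats")

def min_type2_error(n, c):
    """Smallest Type II error of a threshold test whose Type I error <= exp(-c n)."""
    ks = np.arange(n + 2)
    alpha = binom.sf(ks - 1, n, p0)      # Type I error P_{P0}(K >= k) for each threshold k
    t = ks[alpha <= np.exp(-c * n)][0]   # smallest threshold meeting the constraint
    return binom.cdf(t - 1, n, p1)       # Type II error P_{P1}(K < t)

# One rate below D(P1||P0) and one above: the Type II error should
# vanish in the first case and climb toward 1 in the second.
for c in (0.5 * D, 1.5 * D):
    errs = {n: min_type2_error(n, c) for n in (50, 200, 800)}
    print(f"c = {c:.3f}: " + ", ".join(f"n={n}: {e:.4f}" for n, e in errs.items()))
```

Under these assumptions, the printed Type II errors shrink toward 0 for $c = 0.5\,D(P_1\|P_0)$ and approach 1 for $c = 1.5\,D(P_1\|P_0)$ as $n$ grows, matching the dichotomy stated in the abstract.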
Similar Papers
Deviation Inequalities for Rényi Divergence Estimators via Variational Expression
Information Theory
Measures how different two sets of information are.
The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints
Information Theory
Makes computers learn faster with less data.