Two-Parameter Rényi Information Quantities with Applications to Privacy Amplification and Soft Covering
By: Shi-Bing Li, Ke Li, Lei Yu
Potential Business Impact:
Sharpens the mathematical tools used to prove how well privacy-amplification schemes protect secrets.
There is no universally accepted definition of Rényi conditional entropy or Rényi mutual information; instead, several definitions, motivated by different applications, have been proposed in the literature. In this paper, we consider a family of two-parameter Rényi conditional entropies and a family of two-parameter Rényi mutual informations. Under a change of variables for the parameters, the two-parameter Rényi conditional entropy we study coincides precisely with the definition introduced by Hayashi and Tan [IEEE Trans. Inf. Theory, 2016], and it also emerges naturally as the classical specialization of the three-parameter quantum Rényi conditional entropy recently put forward by Rubboli, Goodarzi, and Tomamichel [arXiv:2410.21976 (2024)]. We establish several fundamental properties of the two-parameter Rényi conditional entropy, including monotonicity with respect to the parameters and a variational expression. The associated two-parameter Rényi mutual information considered in this paper is new, and it unifies three commonly used variants of Rényi mutual information. For this quantity, we prove several important properties, including non-negativity, additivity, the data processing inequality, monotonicity with respect to the parameters, a variational expression, as well as convexity and concavity. Finally, we demonstrate that these two-parameter Rényi information quantities characterize the strong converse exponents in the privacy amplification and soft covering problems under Rényi divergence of order $\alpha \in (0, \infty)$.
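As background for the abstract's terminology, the following is a minimal Python sketch (not from the paper; the toy joint distribution and all function names are invented for illustration) of the classical one-parameter quantities involved: the Rényi divergence of order $\alpha$, Arimoto's conditional Rényi entropy, and Sibson's $\alpha$-mutual information, the latter being one of the commonly used variants of Rényi mutual information. The paper's two-parameter families, which are not reproduced here, generalize quantities of this kind.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    # D_alpha(P||Q) = (1/(alpha-1)) * log sum_x p(x)^alpha * q(x)^(1-alpha);
    # the limit alpha -> 1 recovers the Kullback-Leibler divergence.
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def arimoto_conditional_entropy(p_xy, alpha):
    # Arimoto's H_alpha(X|Y) = (alpha/(1-alpha)) * log sum_y (sum_x P_XY(x,y)^alpha)^(1/alpha).
    per_y = np.sum(p_xy**alpha, axis=0) ** (1.0 / alpha)
    return float(alpha / (1.0 - alpha) * np.log(np.sum(per_y)))

def sibson_mutual_information(p_xy, alpha):
    # Sibson's I_alpha(X;Y) = min over Q_Y of D_alpha(P_XY || P_X x Q_Y), which has the
    # closed form (alpha/(alpha-1)) * log sum_y (sum_x P_X(x) P_{Y|X}(y|x)^alpha)^(1/alpha).
    p_x = p_xy.sum(axis=1, keepdims=True)
    per_y = np.sum(p_x * (p_xy / p_x) ** alpha, axis=0) ** (1.0 / alpha)
    return float(alpha / (alpha - 1.0) * np.log(np.sum(per_y)))

# Toy joint distribution P_XY (rows index x, columns index y); values made up for illustration.
p_xy = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
for alpha in (0.5, 2.0):
    print(f"alpha={alpha}: "
          f"H_A(X|Y)={arimoto_conditional_entropy(p_xy, alpha):.4f}, "
          f"I_S(X;Y)={sibson_mutual_information(p_xy, alpha):.4f}")
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
print(f"D_2(P_X||P_Y)={renyi_divergence(p_x, p_y, 2.0):.4f}")
```

All values are in nats; a sanity check is that both quantities reduce to the usual conditional entropy and mutual information as $\alpha \to 1$, and Sibson's mutual information vanishes when $X$ and $Y$ are independent.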
Similar Papers
Entropic Isoperimetric and Cramér–Rao Inequalities for Rényi–Fisher Information
Information Theory
Proves new inequalities connecting Rényi entropy with Fisher-type information measures.
Generalized informational functionals and new monotone measures of statistical complexity
Information Theory
Proposes new monotone measures for quantifying the statistical complexity of data.
Deviation Inequalities for Rényi Divergence Estimators via Variational Expression
Information Theory
Gives reliability guarantees for estimating Rényi divergence from data.