Two-Parameter Rényi Information Quantities with Applications to Privacy Amplification and Soft Covering

Published: November 4, 2025 | arXiv ID: 2511.02297v1

By: Shi-Bing Li, Ke Li, Lei Yu

Potential Business Impact:

Sharpens the mathematical tools used to analyze privacy amplification and related information-theoretic privacy guarantees.

Business Areas:
A/B Testing Data and Analytics

Although several definitions of Rényi conditional entropy and Rényi mutual information have been proposed in the literature, motivated by different applications, none is universally accepted. In this paper, we consider a family of two-parameter Rényi conditional entropies and a family of two-parameter Rényi mutual informations. After a change of variables in the parameters, the two-parameter Rényi conditional entropy we study coincides precisely with the definition introduced by Hayashi and Tan [IEEE Trans. Inf. Theory, 2016], and it also emerges naturally as the classical specialization of the three-parameter quantum Rényi conditional entropy recently put forward by Rubboli, Goodarzi, and Tomamichel [arXiv:2410.21976 (2024)]. We establish several fundamental properties of the two-parameter Rényi conditional entropy, including monotonicity with respect to the parameters and a variational expression. The associated two-parameter Rényi mutual information considered in this paper is new, and it unifies three commonly used variants of Rényi mutual information. For this quantity, we prove several important properties, including non-negativity, additivity, the data-processing inequality, monotonicity with respect to the parameters, a variational expression, and convexity and concavity. Finally, we demonstrate that these two-parameter Rényi information quantities characterize the strong converse exponents in the privacy amplification and soft covering problems under Rényi divergence of order $\alpha \in (0, \infty)$.
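As background for the abstract, here is a minimal numerical sketch of two standard one-parameter quantities that the paper's two-parameter families generalize: the Rényi divergence of order α and Sibson's Rényi mutual information (one of the commonly used variants the abstract alludes to). The function names and toy distributions are illustrative, not taken from the paper.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) = (1/(alpha-1)) log sum_x P(x)^alpha Q(x)^(1-alpha).

    Defined here for alpha > 0, alpha != 1; as alpha -> 1 it converges to the
    KL divergence. It is non-negative and non-decreasing in alpha.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def sibson_mutual_information(p_x, channel, alpha):
    """Sibson's Rényi mutual information of order alpha (alpha != 1):

        I_alpha(X;Y) = (alpha/(alpha-1)) * log sum_y ( sum_x P(x) W(y|x)^alpha )^(1/alpha)

    where channel[x, y] = W(y|x). One of the classical one-parameter variants.
    """
    p_x = np.asarray(p_x, dtype=float)
    channel = np.asarray(channel, dtype=float)
    inner = np.sum(p_x[:, None] * channel**alpha, axis=0) ** (1.0 / alpha)
    return float(alpha / (alpha - 1.0) * np.log(np.sum(inner)))
```

For a noiseless binary channel with a uniform input, Sibson's mutual information equals log 2 for every order α, matching the Shannon case; for skewed distributions one can also check numerically that the Rényi divergence grows with α.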

Page Count
35 pages

Category
Computer Science:
Information Theory