Contraction of Rényi Divergences for Discrete Channels: Properties and Applications
By: Adrien Vandenbroucque, Amedeo Roberto Esposito, Michael Gastpar
This work explores properties of Strong Data-Processing constants for Rényi Divergences. Parallels are made with the well-studied $\varphi$-Divergences, and it is shown that the order $\alpha$ of Rényi Divergences dictates whether certain properties of the contraction of $\varphi$-Divergences are mirrored or not. In particular, we demonstrate that when $\alpha>1$, the contraction properties can deviate quite strikingly from those of $\varphi$-Divergences. We also uncover specific characteristics of contraction for the $\infty$-Rényi Divergence and relate it to $\varepsilon$-Local Differential Privacy. The results are then applied to bound the speed of convergence of Markov chains, where we argue that the contraction of Rényi Divergences offers a new perspective on the contraction of $L^\alpha$-norms commonly studied in the literature.
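The contraction studied in the abstract can be illustrated numerically: for a discrete channel $W$, one compares $D_\alpha(pW \,\|\, qW)$ against $D_\alpha(p \,\|\, q)$ over input pairs. The sketch below is not from the paper; it is a minimal illustration of the definition, using a binary symmetric channel and random-sampling to lower-bound the contraction constant (the function names and the choice of channel are my own assumptions).

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p||q) = log(sum p^alpha q^(1-alpha)) / (alpha - 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if alpha == 1.0:  # KL divergence as the alpha -> 1 limit
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def contraction_ratio(W, p, q, alpha):
    """Ratio D_alpha(pW || qW) / D_alpha(p || q) for a row-stochastic channel W."""
    return renyi_divergence(p @ W, q @ W, alpha) / renyi_divergence(p, q, alpha)

# Binary symmetric channel with crossover probability 0.1 (illustrative choice).
eps = 0.1
W = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

# Lower-bound the strong data-processing constant by sampling input pairs.
rng = np.random.default_rng(0)
best = 0.0
for _ in range(2000):
    p = rng.dirichlet([1.0, 1.0])
    q = rng.dirichlet([1.0, 1.0])
    best = max(best, contraction_ratio(W, p, q, alpha=2.0))

print(f"sampled lower bound on the order-2 contraction constant: {best:.4f}")
```

By the data-processing inequality the ratio never exceeds 1, and for a non-trivial channel the sampled supremum stays strictly below 1, which is exactly the "strong" part of strong data processing.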
Similar Papers
Bounds on the privacy amplification of arbitrary channels via the contraction of $f_α$-divergence
Information Theory
Protects your private data even with leaky systems.
Non-Linear Strong Data-Processing for Quantum Hockey-Stick Divergences
Quantum Physics
Makes secret messages safer in quantum computers.
Some properties and applications of the new quantum $f$-divergences
Quantum Physics
Improves how we measure information in quantum computers.