A new group of transformations related to the Kullback-Leibler and Rényi divergences and universal classes of monotone measures of statistical complexity
By: Razvan Gabriel Iagar, David Puertas-Centeno, Elio V. Toranzo
In this work we introduce a family of transformations, named \textit{divergence transformations}, interpolating between any pair of probability density functions sharing the same support. We prove the remarkable property that the whole family of Kullback-Leibler and Rényi divergences evolves monotonically with respect to the transformation parameter. Moreover, fixing the reference density, we show that the divergence transformations enjoy a group structure and can be derived through the algebraic conjugation of the recently introduced differential-escort transformations and their relative counterparts. This algebraic structure allows us to deform any density function in such a way that its divergence with respect to a fixed reference density increases as much as desired. We also establish the monotonicity of composed measures involving the Kullback-Leibler and Rényi divergences themselves, as well as other recently introduced relative measures of moment and Fisher types. As an application, an approximation scheme for general density functions by simple functions is provided. In addition, we give a number of analytical and numerical examples of interest in both regimes of increasing and decreasing divergence.
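The central claim, that the Kullback-Leibler divergence evolves monotonically along a one-parameter family interpolating between two densities with common support, can be illustrated numerically. The sketch below is NOT the paper's divergence transformation (which is built from differential-escort maps and their relative counterparts); as an illustrative stand-in it uses the simple geometric interpolation $p_t \propto p^{1-t} q^t$, for which $D_{KL}(p_t \| q)$ also decreases monotonically in the equal-variance Gaussian case shown. All function names and parameter values are choices made for this example.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian density evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Discretize two equal-variance Gaussians on a common grid (shared support).
dx = 0.01
xs = [-10 + dx * i for i in range(2001)]
p = [gaussian(x, 0.0, 1.0) for x in xs]  # density to be deformed
q = [gaussian(x, 2.0, 1.0) for x in xs]  # fixed reference density

def geometric_path(p, q, t):
    """Normalized geometric interpolation p_t proportional to p^(1-t) q^t."""
    w = [pi ** (1 - t) * qi ** t for pi, qi in zip(p, q)]
    z = sum(w) * dx
    return [wi / z for wi in w]

def kl(a, b):
    """Riemann-sum approximation of D_KL(a || b)."""
    return sum(ai * math.log(ai / bi) * dx for ai, bi in zip(a, b) if ai > 0)

ts = [0.0, 0.25, 0.5, 0.75, 1.0]
kls = [kl(geometric_path(p, q, t), q) for t in ts]
# kls decreases monotonically from D_KL(p || q) = 2.0 down to 0.
```

For these two unit-variance Gaussians the exact value along this path is $D_{KL}(p_t\|q) = (1-t)^2 (\Delta\mu)^2/2 = 2(1-t)^2$, so the computed sequence is strictly decreasing; the paper's transformations realize this kind of monotone behaviour for the whole Rényi family and for arbitrary density pairs.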
Similar Papers
Rényi Divergences in Central Limit Theorems: Old and New
Information Theory
Extends Rényi-divergence convergence results for central limit theorems.
Quantum $f$-divergences and Their Local Behaviour: An Analysis via Relative Expansion Coefficients
Quantum Physics
Analyzes the local behaviour of quantum $f$-divergences via relative expansion coefficients.
A Hierarchical Decomposition of Kullback-Leibler Divergence: Disentangling Marginal Mismatches from Statistical Dependencies
Other Computer Science
Decomposes the Kullback-Leibler divergence into marginal-mismatch and statistical-dependence contributions.