Rényi Divergences in Central Limit Theorems: Old and New
By: S. G. Bobkov, G. P. Chistyakov, F. Götze
We give an overview of various results and methods related to information-theoretic distances of Rényi type in the light of their applications to the central limit theorem (CLT). The first part (Sections 1-9) is devoted to the total variation and the Kullback-Leibler distance (relative entropy). In the second part (Sections 10-15) we discuss general properties of Rényi and Tsallis divergences of order $\alpha>1$, and then in the third part (Sections 16-21) we turn to the CLT and non-uniform local limit theorems with respect to these strong distances. In the fourth part (Sections 22-31), we discuss recent results on strictly subgaussian distributions and describe necessary and sufficient conditions which ensure the validity of the CLT with respect to the Rényi divergence of infinite order.
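As a quick illustration of the divergences the survey studies (this sketch is not from the paper itself), for discrete distributions $P=(p_i)$ and $Q=(q_i)$ the Rényi divergence of order $\alpha>1$ is $D_\alpha(P\|Q)=\frac{1}{\alpha-1}\log\sum_i p_i^\alpha q_i^{1-\alpha}$, the Tsallis divergence is $T_\alpha(P\|Q)=\frac{1}{\alpha-1}\bigl(\sum_i p_i^\alpha q_i^{1-\alpha}-1\bigr)$, the Kullback-Leibler divergence is the limit as $\alpha\to 1$, and the infinite-order Rényi divergence is $D_\infty(P\|Q)=\log\max_i(p_i/q_i)$. A minimal NumPy sketch, assuming full-support discrete distributions; the example vectors `p` and `q` are arbitrary:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) of order alpha > 1 (natural log)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def tsallis_divergence(p, q, alpha):
    """Tsallis divergence T_alpha(P||Q) of order alpha > 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return (np.sum(p**alpha * q**(1.0 - alpha)) - 1.0) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence (relative entropy), the alpha -> 1 limit."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# D_alpha is non-decreasing in alpha and tends to KL as alpha -> 1:
print(renyi_divergence(p, q, 1.0001), kl_divergence(p, q))
# For large alpha, D_alpha approaches the infinite-order divergence
# D_infinity = log max_i (p_i / q_i):
print(renyi_divergence(p, q, 200.0), np.log(np.max(p / q)))
```

The "strong distances" terminology in the abstract reflects that convergence in $D_\alpha$ for $\alpha>1$ (and especially for $\alpha=\infty$) is more demanding than convergence in total variation or relative entropy, which is why the CLT under these divergences requires extra conditions such as strict subgaussianity.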
Similar Papers
A new group of transformations related to the Kullback-Leibler and Rényi divergences and universal classes of monotone measures of statistical complexity
Mathematical Physics
CLT for LES of real valued random centrosymmetric matrices
Probability
Deviation Inequalities for Rényi Divergence Estimators via Variational Expression
Information Theory