Rényi Divergences in Central Limit Theorems: Old and New

Published: March 5, 2025 | arXiv ID: 2503.03926v1

By: S. G. Bobkov, G. P. Chistyakov, F. Götze

Potential Business Impact:

Extends the mathematical guarantees behind Gaussian approximations (the central limit theorem) to stronger information-theoretic distances, so such approximations can be justified for a broader range of data.

Business Areas:
A/B Testing, Data and Analytics

We give an overview of various results and methods related to information-theoretic distances of Rényi type in the light of their applications to the central limit theorem (CLT). The first part (Sections 1-9) is devoted to the total variation and the Kullback-Leibler distance (relative entropy). In the second part (Sections 10-15) we discuss general properties of Rényi and Tsallis divergences of order α > 1, and then in the third part (Sections 16-21) we turn to the CLT and non-uniform local limit theorems with respect to these strong distances. In the fourth part (Sections 22-31), we discuss recent results on strictly subgaussian distributions and describe necessary and sufficient conditions which ensure the validity of the CLT with respect to the Rényi divergence of infinite order.
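For readers unfamiliar with these quantities, the following standard definitions may help fix ideas; they are stated here for probability densities p and q, and the paper may use slightly different normalizations. The Rényi divergence of order α and its Tsallis counterpart are

\[
D_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\,\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx,
\qquad
T_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\left(\int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx - 1\right),
\]

with the limiting cases

\[
\lim_{\alpha \to 1} D_\alpha(p\,\|\,q) = D_{\mathrm{KL}}(p\,\|\,q) = \int p(x)\,\log\frac{p(x)}{q(x)}\, dx,
\qquad
D_\infty(p\,\|\,q) = \log\,\operatorname*{ess\,sup}_x \frac{p(x)}{q(x)}.
\]

Convergence in D_α for large α (and especially for α = ∞) is a much stronger requirement than convergence in total variation or relative entropy, which is why the CLT results described in the abstract need additional conditions such as strict subgaussianity.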

Country of Origin
🇺🇸 United States

Page Count
66 pages

Category
Computer Science:
Information Theory