Asymptotic and finite-sample distributions of one- and two-sample empirical relative entropy, with application to change-point detection

Published: December 18, 2025 | arXiv ID: 2512.16411v1

By: Matthieu Garcin, Louis Perot

Potential Business Impact:

Detects when the distribution of a data series changes by comparing statistical patterns before and after candidate change points.

Business Areas:
A/B Testing, Data and Analytics

Relative entropy, as a divergence metric between two distributions, can be used for offline change-point detection and extends classical methods that mainly rely on moment-based discrepancies. To build a statistical test suitable for this context, we study the distribution of empirical relative entropy and derive several types of approximations: concentration inequalities for finite samples, asymptotic distributions, and Berry-Esseen bounds in a pre-asymptotic regime. For the latter, we introduce a new approach to obtain Berry-Esseen inequalities for nonlinear functions of sum statistics under some convexity assumptions. Our theoretical contributions cover both one- and two-sample empirical relative entropies. We then detail a change-point detection procedure built on relative entropy and compare it, through extensive simulations, with classical methods based on moments or on information criteria. Finally, we illustrate its practical relevance on two real datasets involving temperature series and volatility of stock indices.
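The offline procedure sketched in the abstract, splitting a series at each candidate point and comparing the two segments via empirical relative entropy, can be illustrated with a minimal Python sketch. This is not the authors' exact estimator or test: the histogram binning, the additive smoothing, and the `min_seg` guard are illustrative assumptions, and a formal test would replace the simple argmax with thresholds derived from the paper's distributional results.

```python
import numpy as np

def empirical_kl(x, y, bins=20):
    """Two-sample empirical relative entropy (KL divergence) between
    histogram estimates of the distributions of x and y.
    Binning and smoothing choices are illustrative, not the paper's."""
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    # Additive smoothing avoids log(0) and division by zero.
    p = (p + 1.0) / (p.sum() + bins)
    q = (q + 1.0) / (q.sum() + bins)
    return float(np.sum(p * np.log(p / q)))

def detect_change_point(series, min_seg=30, bins=20):
    """Scan candidate split points and return the one maximizing the
    two-sample empirical relative entropy between the segments.
    Offline, single change point; a statistical test would compare
    the maximum against a critical value."""
    n = len(series)
    best_t, best_d = None, -np.inf
    for t in range(min_seg, n - min_seg):
        d = empirical_kl(series[:t], series[t:], bins=bins)
        if d > best_d:
            best_t, best_d = t, d
    return best_t, best_d

# Synthetic example: a mean shift at index 200.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 200), rng.normal(2, 1, 200)])
t_hat, stat = detect_change_point(series)
```

Because relative entropy compares whole distributions rather than a single moment, the same scan also reacts to variance or shape changes that a mean-based statistic would miss.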

Page Count
37 pages

Category
Statistics: Methodology