Asymptotic and finite-sample distributions of one- and two-sample empirical relative entropy, with application to change-point detection
By: Matthieu Garcin, Louis Perot
Potential Business Impact:
Detects when the statistical behavior of data changes by comparing full distributions rather than just averages.
Relative entropy, as a divergence measure between two distributions, can be used for offline change-point detection, extending classical methods that mainly rely on moment-based discrepancies. To build a statistical test suitable for this context, we study the distribution of the empirical relative entropy and derive several types of approximations: concentration inequalities for finite samples, asymptotic distributions, and Berry-Esseen bounds in a pre-asymptotic regime. For the latter, we introduce a new approach to obtaining Berry-Esseen inequalities for nonlinear functions of sum statistics under convexity assumptions. Our theoretical contributions cover both one- and two-sample empirical relative entropies. We then detail a change-point detection procedure built on relative entropy and compare it, through extensive simulations, with classical methods based on moments or on information criteria. Finally, we illustrate its practical relevance on two real datasets involving temperature series and the volatility of stock indices.
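To make the idea concrete, here is a minimal sketch of offline change-point detection based on a two-sample empirical relative entropy, D(P||Q) = sum_x p(x) log(p(x)/q(x)). The histogram binning, Laplace smoothing, and exhaustive split scan are illustrative assumptions for this sketch, not the exact estimator or test derived in the paper.

# Minimal sketch (assumptions: histogram plug-in estimator, Laplace smoothing,
# brute-force split scan). Not the paper's exact procedure.
import numpy as np

def empirical_relative_entropy(x, y, bins=20):
    """Plug-in estimate of D(P_x || P_y) from histograms on a common grid."""
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    # Laplace smoothing avoids empty bins, where the plug-in estimator diverges.
    p = (p + 1) / (p.sum() + bins)
    q = (q + 1) / (q.sum() + bins)
    return float(np.sum(p * np.log(p / q)))

def detect_change_point(series, min_seg=30, bins=20):
    """Return the split index maximizing the divergence between the two segments."""
    best_tau, best_div = None, -np.inf
    for tau in range(min_seg, len(series) - min_seg):
        d = empirical_relative_entropy(series[:tau], series[tau:], bins)
        if d > best_div:
            best_tau, best_div = tau, d
    return best_tau, best_div

# Usage: a series whose mean shifts at index 200.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 300)])
tau, div = detect_change_point(series)
print(f"estimated change point: {tau}, divergence: {div:.4f}")

In a full test, the distributional results studied in the paper (concentration inequalities, asymptotic distributions, or Berry-Esseen bounds) would calibrate a significance threshold for the maximized divergence, rather than simply reporting the argmax.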
Similar Papers
Change-Point Detection Utilizing Normalized Entropy as a Fundamental Metric
Applications
Finds sudden changes in data patterns.
Estimation of discrete distributions in relative entropy, and the deviations of the missing mass
Statistics Theory
Estimates discrete probability distributions from samples more accurately.
Accelerated optimization of measured relative entropies
Quantum Physics
Speeds up the computation of quantum relative-entropy measures.