Bounds of Shannon entropy and Extropy and their application in exploring the extreme value behavior of a large set of data

Published: July 18, 2025 | arXiv ID: 2507.13656v1

By: Konstantinos Zografos

Potential Business Impact:

Derives sharp bounds for entropy and extropy, standard measures of uncertainty ("surprise") in data, applied to the extreme values of very large datasets.

Business Areas:
Big Data, Data and Analytics

This paper derives bounds for two ubiquitous information-theoretic measures, the Shannon entropy and its complementary dual, the extropy. For a large sample drawn from a log-concave model, these bounds are obtained for the entropy and the extropy of the distribution of the largest order statistic and of the respective normalized sequence, in the extreme value theory setting. A characterization of the exponential distribution is provided as the model that maximizes the Shannon entropy and the extropy associated with the distribution of the maximum value, in the large-sample regime. This characterization is exploited to give an alternative, immediate proof of the convergence of the Shannon entropy and extropy of the normalized maxima of a large sample to the respective measures of the Gumbel distribution, studied recently for Shannon entropy in Johnson (2024) and the references therein.

Country of Origin
🇬🇷 Greece

Page Count
27 pages

Category
Mathematics:
Statistics Theory