Bounds of Shannon entropy and Extropy and their application in exploring the extreme value behavior of a large set of data
By: Konstantinos Zografos
Potential Business Impact:
Gives sharper limits on how uncertain the largest values in a big data set can be.
This paper derives bounds for two ubiquitous information-theoretic measures, the Shannon entropy and its complementary dual, the extropy. For a large sample from a log-concave model, these bounds are obtained for the entropy and the extropy of the distribution of the largest order statistic and of the corresponding normalized sequence, in the extreme value theory setting. A characterization of the exponential distribution is provided as the model that maximizes the Shannon entropy and the extropy associated with the distribution of the maximum, in the large-sample regime. This characterization is exploited to give an alternative, immediate proof of the convergence of the Shannon entropy and extropy of the normalized maxima of a large sample to the respective measures of the Gumbel distribution, studied recently for Shannon entropy in Johnson (2024) and the references therein.
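As a purely illustrative, minimal numerical sketch (not taken from the paper), the snippet below checks the limiting behavior stated above in the special case of Exponential(1) samples: the Shannon entropy H(X) = -∫ f log f and the extropy J(X) = -(1/2) ∫ f² of the normalized maximum M_n - log n approach the standard Gumbel values 1 + γ and -1/8 as n grows. The function names, the integration grid, and the choice of the exponential model are assumptions made for illustration only.

```python
import numpy as np

def entropy_extropy(pdf, grid):
    """Numerical Shannon entropy H = -int f log f and extropy J = -(1/2) int f^2."""
    f = pdf(grid)
    pos = f > 0                      # avoid log(0); the support is a contiguous block
    h = -np.trapz(f[pos] * np.log(f[pos]), grid[pos])
    j = -0.5 * np.trapz(f ** 2, grid)
    return h, j

def normalized_max_exponential_pdf(n):
    """Density of M_n - log n, where M_n is the maximum of n i.i.d. Exponential(1) draws."""
    def pdf(y):
        out = np.zeros_like(y)
        support = y + np.log(n) > 0  # the maximum of n Exponential(1) draws is positive
        ys = y[support]
        out[support] = np.exp(-ys) * (1.0 - np.exp(-ys) / n) ** (n - 1)
        return out
    return pdf

def gumbel_pdf(y):
    """Standard Gumbel density, the extreme value limit for exponential maxima."""
    return np.exp(-y - np.exp(-y))

grid = np.linspace(-10.0, 30.0, 200_001)
print("Gumbel targets: H = 1 + gamma =", 1 + np.euler_gamma, ", J = -1/8 =", -0.125)
for n in (10, 100, 10_000):
    print(f"n = {n:>6}:", entropy_extropy(normalized_max_exponential_pdf(n), grid))
print("Gumbel    :", entropy_extropy(gumbel_pdf, grid))
```

Increasing n should move the printed entropy and extropy toward the Gumbel targets, mirroring the convergence result discussed in the abstract; the exponential case is chosen here only because its maximum has a closed-form density.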
Similar Papers
On the uniqueness of the coupled entropy
Statistical Mechanics
Measures complex system uncertainty better.
Extropy Rate: Properties and Application in Feature Selection
Information Theory
Measures information in data to pick best clues.
Inequalities Revisited
Information Theory
Finds new math rules by looking at old ones.