Partitioning the Sample Space for a More Precise Shannon Entropy Estimation

Published: December 10, 2025 | arXiv ID: 2512.10133v1

By: Gabriel F. A. Bastos, Jugurta Montalvão

Potential Business Impact:

Enables more reliable estimation of the information content (Shannon entropy) of a system from limited data, even when many possible outcomes have not yet been observed.

Business Areas:
A/B Testing, Data and Analytics

Reliable data-driven estimation of Shannon entropy from small data sets, where the number of examples may be smaller than the number of possible outcomes, is a critical matter in several applications. In this paper, we introduce a discrete entropy estimator that uses the decomposability property of entropy together with estimates of the missing mass and of the number of unseen outcomes to compensate for the negative bias that undersampling induces. Experimental results show that the proposed method outperforms some classical estimators in undersampled regimes and performs comparably with some well-established state-of-the-art estimators.
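
The abstract describes splitting the entropy over the seen and unseen parts of the sample space and correcting the plug-in estimate with estimates of the missing mass and of the number of unseen outcomes. The sketch below illustrates that general idea only; the function name `corrected_entropy`, the Good-Turing singleton fraction as the missing-mass estimate, and the uniform treatment of a guessed number of unseen outcomes are assumptions made for illustration, not the paper's actual estimator.

```python
import numpy as np
from collections import Counter

def corrected_entropy(samples):
    """Bias-corrected plug-in entropy estimate in nats (illustrative sketch).

    Splits the entropy into seen and unseen parts (grouping/decomposability
    property), estimates the missing mass with the Good-Turing singleton
    fraction, and spreads that mass uniformly over a guessed number of
    unseen outcomes. A stand-in for the idea, not the paper's estimator.
    """
    counts = Counter(samples)
    n = sum(counts.values())
    freqs = np.array(list(counts.values()), dtype=float)

    # Naive plug-in (maximum-likelihood) estimate; negatively biased when
    # the sample is small relative to the number of possible outcomes.
    p_hat = freqs / n
    h_plugin = -np.sum(p_hat * np.log(p_hat))

    # Good-Turing missing mass: fraction of outcomes observed exactly once.
    singletons = int(np.sum(freqs == 1))
    p0 = singletons / n

    if not (0.0 < p0 < 1.0):
        return h_plugin  # no usable correction in these degenerate cases

    # Crude proxy for the number of unseen outcomes (assumption of this sketch).
    k_unseen = max(singletons, 1)

    # Grouping property: H = H(p0, 1 - p0) + (1 - p0) * H(seen) + p0 * H(unseen),
    # with the plug-in estimate standing in for H(seen) and a uniform
    # distribution over k_unseen outcomes standing in for H(unseen).
    h_split = -p0 * np.log(p0) - (1.0 - p0) * np.log(1.0 - p0)
    return h_split + (1.0 - p0) * h_plugin + p0 * np.log(k_unseen)

# Example: undersampled draws from a 50-outcome uniform source (true H = ln 50).
rng = np.random.default_rng(0)
data = rng.integers(0, 50, size=30)
print(f"corrected: {corrected_entropy(data):.3f}  true: {np.log(50):.3f}")
```

In undersampled regimes the plug-in estimate falls short of the true entropy, and a correction of this kind pushes the estimate back up; how the missing mass and the unseen-outcome count are actually estimated and combined in the paper should be taken from the paper itself.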

Country of Origin
🇧🇷 Brazil

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)