Numerical evaluation of Gaussian mixture entropy
By: Basheer Joudeh, Boris Škorić
Potential Business Impact:
Makes it easier to quantify how spread out or uncertain data is when it is modeled as a mixture of Gaussians.
We develop an approximation method for the differential entropy $h(\mathbf{X})$ of a $q$-component Gaussian mixture in $\mathbb{R}^n$. We provide two examples of approximations obtained with our method, denoted $\bar{h}^{\mathrm{Taylor}}_{C,m}(\mathbf{X})$ and $\bar{h}^{\mathrm{Polyfit}}_{C,m}(\mathbf{X})$. We show that $\bar{h}^{\mathrm{Taylor}}_{C,m}(\mathbf{X})$ provides an easy-to-compute lower bound on $h(\mathbf{X})$, while $\bar{h}^{\mathrm{Polyfit}}_{C,m}(\mathbf{X})$ provides an accurate and efficient approximation to $h(\mathbf{X})$. $\bar{h}^{\mathrm{Polyfit}}_{C,m}(\mathbf{X})$ is more accurate than known bounds, and is conjectured to be much more resilient in high dimensions than the approximation of [5].
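The abstract does not spell out the Taylor or polynomial-fit constructions, so the sketch below is not the authors' method; it is only a naive Monte Carlo baseline for the quantity being approximated, $h(\mathbf{X}) = -\mathbb{E}[\log p(\mathbf{X})]$ of a Gaussian mixture. All mixture parameters (weights, means, covariances) are hypothetical, chosen purely for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Illustrative 2-component mixture in R^2; these parameters are
# hypothetical and not taken from the paper.
weights = np.array([0.4, 0.6])
means = [np.zeros(2), np.array([3.0, 0.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]

def mixture_pdf(x):
    """Density p(x) of the Gaussian mixture at points x of shape (N, n)."""
    return sum(w * multivariate_normal(mean=m, cov=c).pdf(x)
               for w, m, c in zip(weights, means, covs))

def mc_entropy(num_samples=100_000):
    """Monte Carlo estimate of h(X) = -E[log p(X)]:
    sample from the mixture, then average -log p over the samples."""
    comp = rng.choice(len(weights), size=num_samples, p=weights)
    samples = np.empty((num_samples, 2))
    for k in range(len(weights)):
        idx = comp == k
        samples[idx] = rng.multivariate_normal(means[k], covs[k],
                                               size=int(idx.sum()))
    return float(-np.mean(np.log(mixture_pdf(samples))))

print(f"Monte Carlo estimate of h(X) (nats): {mc_entropy():.4f}")
```

Sampling estimates like this become expensive when tight error bars are needed, which is the kind of cost that efficient closed-form-style approximations such as $\bar{h}^{\mathrm{Polyfit}}_{C,m}(\mathbf{X})$ aim to avoid.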
Similar Papers
Entropy approximations of algebraic matroids over finite fields
Combinatorics
Approximates the entropy of algebraic structures (matroids), linking combinatorics to information theory.
Non-Parametric Goodness-of-Fit Tests Using Tsallis Entropy Measures
Methodology
Tests how well data fits an assumed distribution, without strong model assumptions, using Tsallis entropy.
Nonparametric MLE for Gaussian Location Mixtures: Certified Computation and Generic Behavior
Statistics Theory
Computes certified best-fit Gaussian mixture models for data without fixing the number of components.