Two tales for a geometric Jensen--Shannon divergence
By: Frank Nielsen
Potential Business Impact:
Provides closed-form formulas for comparing Gaussian models, useful in machine learning and information sciences.
The geometric Jensen--Shannon divergence (G-JSD) gained popularity in machine learning and the information sciences thanks to its closed-form expression between Gaussian distributions. In this work, we introduce an alternative definition of the geometric Jensen--Shannon divergence, tailored to positive densities, which does not normalize the geometric mixture. This novel divergence is termed the extended G-JSD, as it extends to more general positive measures. We make explicit the gap between the extended G-JSD and the G-JSD when considering probability densities, and report both lower and upper bounds in terms of other statistical divergences. We derive the corresponding closed-form expressions for the case of multivariate Gaussian distributions often met in applications. Finally, we show that these two types of geometric JSDs, the G-JSD and the extended G-JSD, can be interpreted as regularizations of the ordinary JSD by additive terms.
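To illustrate the closed-form expression highlighted in the abstract, below is a minimal numerical sketch of the G-JSD between two multivariate Gaussians, assuming the common definition of the G-JSD as a weighted sum of Kullback--Leibler divergences to the normalized geometric mixture (which, for Gaussians, is again a Gaussian obtained by interpolating the precision matrices and precision-weighted means). The function names (gjsd_gaussian, kl_gaussian, geometric_mixture_gaussian), the default skew weight alpha = 0.5, and the example parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

def kl_gaussian(mu0, Sigma0, mu1, Sigma1):
    """Closed-form KL(N(mu0, Sigma0) || N(mu1, Sigma1))."""
    d = mu0.shape[0]
    Sigma1_inv = np.linalg.inv(Sigma1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(Sigma1_inv @ Sigma0)
        + diff @ Sigma1_inv @ diff
        - d
        + np.log(np.linalg.det(Sigma1) / np.linalg.det(Sigma0))
    )

def geometric_mixture_gaussian(mu0, Sigma0, mu1, Sigma1, alpha=0.5):
    """Normalized geometric mixture p^(1-alpha) q^alpha of two Gaussians.
    It is again a Gaussian: interpolate precisions and precision-weighted means."""
    Lam0, Lam1 = np.linalg.inv(Sigma0), np.linalg.inv(Sigma1)
    Lam_a = (1 - alpha) * Lam0 + alpha * Lam1
    Sigma_a = np.linalg.inv(Lam_a)
    mu_a = Sigma_a @ ((1 - alpha) * Lam0 @ mu0 + alpha * Lam1 @ mu1)
    return mu_a, Sigma_a

def gjsd_gaussian(mu0, Sigma0, mu1, Sigma1, alpha=0.5):
    """G-JSD_alpha(p:q) = (1-alpha) KL(p : G_alpha) + alpha KL(q : G_alpha),
    with G_alpha the normalized geometric mixture (weight convention assumed)."""
    mu_a, Sigma_a = geometric_mixture_gaussian(mu0, Sigma0, mu1, Sigma1, alpha)
    return ((1 - alpha) * kl_gaussian(mu0, Sigma0, mu_a, Sigma_a)
            + alpha * kl_gaussian(mu1, Sigma1, mu_a, Sigma_a))

# Example: two bivariate Gaussians (arbitrary illustrative parameters).
mu_p, Sigma_p = np.zeros(2), np.eye(2)
mu_q, Sigma_q = np.array([1.0, -0.5]), np.array([[2.0, 0.3], [0.3, 1.0]])
print(gjsd_gaussian(mu_p, Sigma_p, mu_q, Sigma_q))
```

For alpha = 0.5 this quantity is symmetric in the two Gaussians; the paper's extended G-JSD instead skips the normalization of the geometric mixture, and its exact gap to the quantity above is the result stated in the abstract.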
Similar Papers
Geometric Jensen-Shannon Divergence Between Gaussian Measures On Hilbert Space
Probability
Measures how different two "probability clouds" are.
Connecting Jensen-Shannon and Kullback-Leibler Divergences: A New Bound for Representation Learning
Machine Learning (CS)
Helps computers learn what's important from data.