Extropy-Based Generalized Divergence and Similarity Ratios: Theory and Applications
By: Saranya P., Sunoj S. M.
Potential Business Impact:
Measures how alike or how different two probability distributions are, which helps compare lifetime data and images.
In this article, we propose two classes of relative information measures based on extropy, viz., the generalized extropy similarity ratio (GESR) and the generalized extropy divergence ratio (GEDR), which measure the similarity and the discrepancy between two probability distributions, respectively. Definitions of GESR and GEDR are proposed along with their fundamental axioms and properties, and some measures satisfying these axioms are introduced. The relationship of GESR with the popular cosine similarity is also established. Various properties of GESR and GEDR, including bounds under the proportional hazards model and the proportional reversed hazards model, are derived. Nonparametric estimators of GESR are defined, and their performance is evaluated using simulation studies. Applications of GESR to lifetime data analysis and image analysis are also demonstrated.
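For readers unfamiliar with extropy, the sketch below is a minimal, illustrative example rather than the paper's construction: it computes the standard extropy of a discrete distribution, J(p) = -sum_i (1 - p_i) log(1 - p_i), and a hypothetical cosine-style similarity between two distributions to show the kind of normalized comparison a similarity ratio formalizes. The function names and the similarity formula are assumptions for illustration only; the GESR defined in the paper may differ.

```python
# Minimal sketch (assumptions, not the paper's definitions): standard extropy of a
# discrete distribution and a hypothetical cosine-style similarity ratio.
import numpy as np

def extropy(p):
    """Extropy of a discrete distribution: J(p) = -sum_i (1 - p_i) * log(1 - p_i)."""
    q = 1.0 - np.asarray(p, dtype=float)
    safe_q = np.where(q > 0.0, q, 1.0)  # treat 0 * log(0) as 0
    return float(-np.sum(q * np.log(safe_q)))

def extropy_similarity_ratio(p1, p2):
    """Hypothetical cosine-style similarity between the (1 - p_i) terms of two
    distributions; equals 1 when the two distributions coincide."""
    u = 1.0 - np.asarray(p1, dtype=float)
    v = 1.0 - np.asarray(p2, dtype=float)
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

p1 = np.array([0.2, 0.3, 0.5])
p2 = np.array([0.25, 0.25, 0.5])
print(extropy(p1), extropy(p2), extropy_similarity_ratio(p1, p2))
```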
Similar Papers
Further results on relative divergence measures based on extropy and their applications
Applications
Compares probability distributions using extropy-based divergence measures.
Distributional Evaluation of Generative Models via Relative Density Ratio
Methodology
Checks if computer-made pictures look real.
Two tales for a geometric Jensen-Shannon divergence
Information Theory
Studies a geometric way to compare probability distributions, useful for computers learning from data.