Extropy Rate: Properties and Application in Feature Selection

Published: July 15, 2025 | arXiv ID: 2507.11242v1

By: Naveen Kumar, Vivek Vijay

Potential Business Impact:

Quantifies the information content of data to select the most informative features.

Business Areas:
A/B Testing, Data and Analytics

Extropy, a complementary dual of entropy proposed by Lad et al. \cite{lad2015extropy} in 2015, has attracted considerable interest from the research community. In this study, we focus on discrete random variables and define conditional extropy, establishing key properties of joint and conditional extropy such as bounds, uncertainty reduction due to additional information, and Lipschitz continuity. We further introduce the extropy rate of a stochastic process of discrete random variables as a measure of the average uncertainty per random variable in the process. For infinite stationary and ergodic stochastic processes, as well as for independently and identically distributed (i.i.d.) sequences, the extropy rate exhibits asymptotic equivalence. We also explore the extropy rate for finite stochastic processes and numerically illustrate its effectiveness in capturing the underlying information across various distributions, quantifying complexity in time series data, and characterizing chaotic dynamics in dynamical systems. The behaviour of the estimated extropy rate closely aligns with Simpson's diversity index. The real-life applicability of the extropy rate is demonstrated through a novel feature selection method, based on the observation that features with higher extropy rates contain greater inherent information. Using six publicly available datasets, we show the superiority of the proposed method over several existing popular approaches.
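For intuition, the following Python sketch illustrates the feature selection idea described in the abstract. The discrete extropy formula J(X) = -sum_i (1 - p_i) log(1 - p_i) follows Lad et al. (2015); the histogram-based plug-in estimator and the names empirical_extropy_rate and rank_features are illustrative assumptions for this sketch, not the paper's exact estimator.

```python
import numpy as np

def extropy(p):
    """Discrete extropy J(X) = -sum_i (1 - p_i) * log(1 - p_i)
    (Lad et al., 2015) for a pmf p."""
    p = np.asarray(p, dtype=float)
    q = 1.0 - p
    q = q[q > 0]  # a cell with p_i = 1 contributes 0 in the limit
    return -np.sum(q * np.log(q))

def empirical_extropy_rate(x, n_bins=10):
    """Plug-in estimate: discretize the feature, estimate the pmf by
    relative frequencies, and evaluate its extropy. A simplified
    stand-in for the paper's extropy-rate estimator."""
    counts, _ = np.histogram(x, bins=n_bins)
    return extropy(counts / counts.sum())

def rank_features(X, n_bins=10):
    """Rank columns of X by estimated extropy; higher extropy is taken
    to indicate greater inherent information, per the paper's criterion."""
    scores = np.array([empirical_extropy_rate(X[:, j], n_bins)
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores

# Toy usage: a near-uniform feature should outrank a nearly degenerate one.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(size=1000),                             # high diversity
    rng.choice([0.0, 1.0], size=1000, p=[0.95, 0.05]),  # low diversity
])
order, scores = rank_features(X)
print("ranking (best first):", order, "scores:", scores)
```

A near-uniform feature yields a pmf close to uniform and therefore a higher extropy score than a nearly degenerate one, mirroring the paper's premise that higher-extropy features carry more inherent information.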

Country of Origin
🇮🇳 India

Page Count
28 pages

Category
Computer Science:
Information Theory