Non-Parametric Goodness-of-Fit Tests Using Tsallis Entropy Measures
By: Mehmet Sıddık Çadırcı
Potential Business Impact:
Provides more reliable statistical tests for messy, heavy-tailed data.
In this paper, we investigate new statistical testing procedures based on Tsallis entropy, a parametric generalization of Shannon entropy. Focusing on multivariate generalized Gaussian and $q$-Gaussian distributions, we develop entropy-based goodness-of-fit tests built on maximum entropy formulations and nearest-neighbour entropy estimators. We also propose a novel iterative approach for estimating the shape parameters of these distributions, which is crucial for practical inference. This method extends entropy estimation beyond traditional approaches, improving precision in heavy-tailed and non-Gaussian settings. Numerical experiments demonstrate the statistical properties and convergence behaviour of the proposed tests. These findings are relevant to disciplines that require robust distributional tests, such as machine learning, signal processing, and information theory.
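The abstract mentions nearest-neighbour entropy estimators for Tsallis entropy. As a minimal sketch of that idea, the snippet below implements the standard Leonenko-Pronzato-Savani $k$-nearest-neighbour estimator of Tsallis entropy $S_q = (1 - \int f^q)/(q-1)$ for $q \neq 1$; the function name and the brute-force neighbour search are illustrative choices, not the paper's actual implementation.

```python
import numpy as np
from math import gamma, pi

def tsallis_entropy_knn(x, q=1.5, k=3):
    """k-NN (Leonenko-Pronzato-Savani) estimator of Tsallis entropy, q != 1.

    x : (n, d) array of samples; requires q < k + 1 for the bias-correction
    constant below to be well defined.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Brute-force pairwise Euclidean distances (fine for modest n).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)           # exclude self-distances
    rho = np.sort(dist, axis=1)[:, k - 1]    # distance to k-th neighbour
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)   # volume of the unit d-ball
    c_k = (gamma(k) / gamma(k + 1 - q)) ** (1 / (1 - q))
    zeta = (n - 1) * c_k * v_d * rho ** d
    i_hat = np.mean(zeta ** (1 - q))         # plug-in estimate of ∫ f^q
    return (1 - i_hat) / (q - 1)
```

For a standard 1-D Gaussian, $\int f^q = (2\pi)^{(1-q)/2} q^{-1/2}$, so at $q=2$ the true value is $S_2 = 1 - 1/(2\sqrt{\pi}) \approx 0.718$, which the estimator approaches as the sample size grows.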
Similar Papers
Goodness-of-fit testing of the distribution of posterior classification probabilities for validating model-based clustering
Statistics Theory
Checks if computer groups data correctly.
On estimation of weighted cumulative residual Tsallis entropy
Statistics Theory
Estimates an entropy-based measure of remaining uncertainty in data.
Strong uniform consistency of nonparametric estimation for quantile-based entropy function under length-biased sampling
Methodology
Measures information in tricky, biased samples.