Parametric convergence rate of some nonparametric estimators in mixtures of power series distributions
By: Fadoua Balabdaoui, Harald Besdziek, Yong Wang
Potential Business Impact:
Estimates mixed count patterns accurately.
We consider the problem of estimating a mixture of power series distributions with infinite support, a class that includes well-known models such as the Poisson, Geometric, Logarithmic, and Negative Binomial probability mass functions. We consider the nonparametric maximum likelihood estimator (NPMLE) and show that, under very mild assumptions, it converges to the true mixture distribution $\pi_0$ at a rate no slower than $(\log n)^{3/2} n^{-1/2}$ in the Hellinger distance. Recent work on minimax lower bounds suggests that the logarithmic factor in the obtained Hellinger rate of convergence cannot be improved, at least for mixtures of Poisson distributions. Furthermore, we construct two nonparametric estimators based on the NPMLE, the weighted least squares estimator and the hybrid estimator, and show that they converge to $\pi_0$ at the parametric rate $n^{-1/2}$ in the $\ell_p$-norm ($p \in [1, \infty]$ or $p \in [2, \infty]$, depending on the estimator). Simulations and a real data application assess the performance of all estimators studied in this paper and illustrate the practical side of the theory. The simulation results show that the NPMLE has the best performance in the Hellinger, $\ell_1$, and $\ell_2$ distances in all scenarios. Finally, to construct confidence intervals for the true mixture probability mass function, both nonparametric and parametric bootstrap procedures are considered, and their performance is compared with respect to the coverage and length of the resulting intervals.
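For reference, a hedged sketch of the objects in the abstract: a power series distribution with parameter $\theta$ has pmf $f_\theta(k) = a_k \theta^k / A(\theta)$, with normalizing series $A(\theta) = \sum_{j \ge 0} a_j \theta^j$ (the Poisson case corresponds to $a_k = 1/k!$ and $A(\theta) = e^\theta$). Writing $Q_0$ for the mixing distribution ($Q_0$, $a_k$, and $A$ are notation assumed here for illustration, not taken from the paper), the true mixture pmf can be written as
\[
\pi_0(k) \;=\; \int \frac{a_k\,\theta^{k}}{A(\theta)}\, \mathrm{d}Q_0(\theta), \qquad k = 0, 1, 2, \dots,
\]
and the Hellinger distance between two pmfs $p$ and $q$ on $\{0, 1, 2, \dots\}$ is
\[
h(p, q) \;=\; \Bigl(\tfrac12 \textstyle\sum_{k \ge 0} \bigl(\sqrt{p(k)} - \sqrt{q(k)}\bigr)^{2}\Bigr)^{1/2}.
\]
Below is a minimal, hedged Python sketch of the kind of computation involved: it approximates the NPMLE of a Poisson mixture by running EM over a fixed grid of candidate support points (a common practical surrogate for the exact NPMLE, not the algorithm used in the paper) and then evaluates a truncated Hellinger distance between the fitted and true mixture pmfs. The sample size, true mixing distribution, grid, and iteration count are illustrative assumptions.

```python
# Hedged sketch: grid-based EM surrogate for the NPMLE of a Poisson mixture,
# followed by a (truncated) Hellinger distance to the true mixture pmf.
# All numerical choices below are illustrative assumptions.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Assumed true mixing distribution: equal weights on Poisson means 1 and 5.
true_means, true_weights = np.array([1.0, 5.0]), np.array([0.5, 0.5])
n = 2000
lam = rng.choice(true_means, size=n, p=true_weights)
x = rng.poisson(lam)

# Fixed candidate support grid for the mixing distribution.
grid = np.linspace(0.05, 15.0, 300)
# pmf matrix: rows = observations, columns = grid points.
P = poisson.pmf(x[:, None], grid[None, :])

# EM iterations for the mixing weights on the grid.
w = np.full(grid.size, 1.0 / grid.size)
for _ in range(500):
    post = P * w                              # unnormalized posterior over grid points
    post /= post.sum(axis=1, keepdims=True)   # E-step: normalize per observation
    w = post.mean(axis=0)                     # M-step: update mixing weights

# Fitted and true mixture pmfs on a range of counts.
ks = np.arange(0, 30)
fitted_pmf = poisson.pmf(ks[:, None], grid[None, :]) @ w
true_pmf = poisson.pmf(ks[:, None], true_means[None, :]) @ true_weights

# Hellinger distance, truncated to counts k < 30.
hellinger = np.sqrt(0.5 * np.sum((np.sqrt(fitted_pmf) - np.sqrt(true_pmf)) ** 2))
print(f"approx. Hellinger distance: {hellinger:.4f}")
```

Fixing the support grid keeps the sketch simple; the exact NPMLE instead selects its support points adaptively and is known to be a discrete distribution with finitely many atoms.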
Similar Papers
Parametric convergence rate of a non-parametric estimator in multivariate mixtures of power series distributions under conditional independence
Statistics Theory
Finds patterns in data even when they're hidden.
Rates of Convergence of Maximum Smoothed Log-Likelihood Estimators for Semi-Parametric Multivariate Mixtures
Statistics Theory
Makes smart guesses about mixed data more reliable.
Nonparametric Inference on Unlabeled Histograms
Statistics Theory
Finds hidden patterns in data, even with missing pieces.