Rates of Convergence of Maximum Smoothed Log-Likelihood Estimators for Semi-Parametric Multivariate Mixtures

Published: November 6, 2025 | arXiv ID: 2511.04226v1

By: Marie Du Roy de Chaumaray, Michael Levine, Matthieu Marbac

Potential Business Impact:

Provides guarantees that statistical estimates for data drawn from a mix of hidden subpopulations are reliable and improve at a known rate.

Business Areas:
A/B Testing, Data and Analytics

Theoretical guarantees are established for a standard estimator in a semi-parametric finite mixture model, where each component density is modeled as a product of univariate densities under a conditional independence assumption. The focus is on the estimator that maximizes a smoothed log-likelihood function, which can be efficiently computed using a majorization-minimization algorithm. This smoothed likelihood applies a nonlinear regularization operator defined as the exponential of a kernel convolution on the logarithm of each component density. Consistency of the estimators is demonstrated by leveraging classical M-estimation frameworks under mild regularity conditions. Subsequently, convergence rates for both finite- and infinite-dimensional parameters are derived by exploiting structural properties of the smoothed likelihood, the behavior of the iterative optimization algorithm, and a thorough study of the profile smoothed likelihood. This work provides the first rigorous theoretical guarantees for this estimation approach, bridging the gap between practical algorithms and statistical theory in semi-parametric mixture modeling.
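To make the smoothing step concrete, below is a minimal numerical sketch of the nonlinear regularization operator described in the abstract: the exponential of a kernel convolution applied to the logarithm of a component density. The function name, the grid discretization, the Gaussian kernel, and the bandwidth value are illustrative assumptions, not the authors' implementation or the full majorization-minimization procedure.

```python
import numpy as np

def nonlinear_smooth(f, grid, h):
    """Illustrative sketch: apply N f = exp(K_h * log f) to a density f
    discretized on an equispaced grid, with a Gaussian kernel K_h.
    (Hypothetical helper; choices of kernel and grid are assumptions.)"""
    dx = grid[1] - grid[0]
    # Gaussian kernel evaluated on pairwise differences of grid points
    diffs = grid[:, None] - grid[None, :]
    K = np.exp(-0.5 * (diffs / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
    # Kernel-smooth the log-density, then exponentiate
    log_f = np.log(np.clip(f, 1e-300, None))
    smoothed_log = (K @ log_f) * dx        # approximates (K_h * log f)(x)
    return np.exp(smoothed_log)

# Example: smooth a standard normal density on a grid
grid = np.linspace(-5.0, 5.0, 401)
f = np.exp(-0.5 * grid ** 2) / np.sqrt(2.0 * np.pi)
Nf = nonlinear_smooth(f, grid, h=0.5)
print(float(np.trapz(Nf, grid)))  # total mass after smoothing (not exactly 1 in general)
```

In the paper's setting, this operator is applied to each univariate component density inside the smoothed log-likelihood; the iterative algorithm then alternates updates of the mixing proportions and the nonparametric component densities.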

Page Count
45 pages

Category
Mathematics:
Statistics Theory