Smoothed Quantile Estimation: A Unified Framework Interpolating to the Mean
By: Saïd Maanan, Azzouz Dermoune, Ahmed El Ghini
This paper develops and analyzes three families of estimators that continuously interpolate between classical quantiles and the sample mean. The construction begins with a smoothed version of the $L_{1}$ loss, indexed by a location parameter $z$ and a smoothing parameter $h \ge 0$, whose minimizer $\hat q(z,h)$ yields a unified M-estimation framework. Depending on how $(z, h)$ is specified, this framework generates three distinct classes of estimators: fixed-parameter smoothed quantile estimators, plug-in estimators of fixed quantiles, and a new continuum of mean-estimating procedures. For all three families we establish consistency and asymptotic normality via a uniform asymptotic equicontinuity argument. The limiting variances admit closed forms, allowing a transparent comparison of efficiency across families and smoothing levels. A geometric decomposition of the parameter space shows that, for a fixed quantile level $\tau$, admissible pairs $(z, h)$ lie on straight lines along which the estimator targets the same population quantile while its asymptotic variance varies along the line. The theoretical analysis reveals two efficiency regimes. Under light-tailed distributions (e.g., Gaussian), smoothing yields a monotone variance reduction. Under heavy-tailed distributions (e.g., Laplace), a finite smoothing parameter $h^{*}(\tau) > 0$ strictly improves efficiency for quantile estimation. Numerical experiments on simulated data and real financial returns validate these conclusions and show that, both asymptotically and in finite samples, the mean-estimating family does not improve upon the sample mean.
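To illustrate the M-estimation construction numerically, the sketch below minimizes an empirical smoothed $L_{1}$ criterion to obtain $\hat q(z,h)$. The abstract does not spell out the exact loss, so the sketch assumes one common smoothing, replacing $|u|$ with $\sqrt{u^{2}+h^{2}}$ and adding a linear tilt $z\,u$, under which $h = 0$ recovers the check loss at level $\tau = z + 1/2$ and large $h$ drives the minimizer toward the sample mean; the paper's precise loss may differ, and the function names `smoothed_loss` and `q_hat` are hypothetical.

```python
# Illustrative sketch only: the smoothing sqrt(u^2 + h^2) and the tilt z*u are
# assumptions, not necessarily the loss analyzed in the paper.
import numpy as np
from scipy.optimize import minimize_scalar


def smoothed_loss(q, x, z, h):
    """Empirical smoothed L1 criterion at candidate location q."""
    u = x - q
    # h = 0 reduces to 0.5*|u| + z*u, i.e. the check loss at level tau = z + 1/2.
    return np.mean(0.5 * np.sqrt(u**2 + h**2) + z * u)


def q_hat(x, z=0.0, h=0.0):
    """Minimize the smoothed criterion over q on [min(x), max(x)] (convex in q)."""
    res = minimize_scalar(smoothed_loss, bounds=(x.min(), x.max()),
                          args=(x, z, h), method="bounded")
    return res.x


rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)          # skewed, so mean != median

print(q_hat(x, z=0.0, h=0.0), np.median(x))           # ~ sample median
print(q_hat(x, z=0.25, h=0.0), np.quantile(x, 0.75))  # ~ empirical 0.75-quantile
print(q_hat(x, z=0.0, h=50.0), np.mean(x))            # large h: close to the sample mean
```

Because the assumed criterion is convex in $q$, a bounded scalar minimization suffices; varying $h$ at fixed $z$ traces out the interpolation from a quantile-type estimator toward a mean-type estimator described in the abstract.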
Similar Papers
Smoothed Quantile Estimation via Interpolation to the Mean
Methodology
Studies smoothed quantile estimators that interpolate between quantiles and the sample mean.
Robustified Gaussian quasi-likelihood inference for volatility
Statistics Theory
Develops robust Gaussian quasi-likelihood inference for volatility models.
Multiscale Asymptotic Normality in Quantile Regression: Hilbert Matrices and Polynomial Designs
Statistics Theory
Establishes asymptotic normality for quantile regression under polynomial designs, with connections to Hilbert matrices.