Multiscale Asymptotic Normality in Quantile Regression: Hilbert Matrices and Polynomial Designs

Published: March 19, 2025 | arXiv ID: 2503.15041v3

By: Saïd Maanan, Azzouz Dermoune, Ahmed El Ghini

Potential Business Impact:

Provides more reliable confidence intervals for trend estimates when data are contaminated by heavy-tailed noise or outliers.

Business Areas:
A/B Testing, Data and Analytics

This paper investigates the asymptotic properties of quantile regression estimators in linear models, with a particular focus on polynomial regressors and robustness to heavy-tailed noise. Under independent and identically distributed (i.i.d.) errors with continuous density around the quantile of interest, we establish a general Central Limit Theorem (CLT) for the quantile regression estimator under normalization using $\Delta_n^{-1}$, yielding asymptotic normality with variance $\tau(1-\tau)/f^2(0) \cdot D_0^{-1}$. In the specific case of polynomial regressors, we show that the design structure induces a Hilbert matrix in the asymptotic covariance, and we derive explicit scaling rates for each coefficient. This generalizes Pollard's and Koenker's earlier results on LAD regression to arbitrary quantile levels $\tau \in (0, 1)$. We also examine the convergence behavior of the estimators and propose a relaxation of the standard CLT-based confidence intervals, motivated by a theoretical inclusion principle. This relaxation replaces the usual $T^{j+1/2}$ scaling with $T^\alpha$, for $\alpha < j + 1/2$, to improve finite-sample coverage. Through extensive simulations under Laplace, Gaussian, and Cauchy noise, we validate this approach and highlight the improved robustness and empirical accuracy of relaxed confidence intervals. This study provides both a unifying theoretical framework and practical inference tools for quantile regression under structured regressors and heavy-tailed disturbances.
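The Hilbert-matrix limit described above can be checked numerically. The sketch below (a minimal illustration, not the authors' code; the sample size `T` and degree `p` are arbitrary choices) builds a polynomial design $x_t = (1, t, \dots, t^p)$, applies the normalization $\Delta_n = \mathrm{diag}(T^{1/2}, T^{3/2}, \dots, T^{p+1/2})$, and compares the normalized Gram matrix against the Hilbert matrix $H_{jk} = 1/(j+k+1)$:

```python
import numpy as np

T, p = 100_000, 3  # illustrative sample size and polynomial degree
t = np.arange(1, T + 1)
X = np.vander(t, p + 1, increasing=True).astype(float)  # columns: 1, t, ..., t^p

# Normalization Delta_n = diag(T^{1/2}, T^{3/2}, ..., T^{p+1/2})
d = T ** (np.arange(p + 1) + 0.5)
G = (X.T @ X) / np.outer(d, d)  # Delta_n^{-1} X'X Delta_n^{-1}

# Limiting Hilbert matrix H[j, k] = 1 / (j + k + 1)
idx = np.arange(p + 1)
H = 1.0 / (idx[:, None] + idx[None, :] + 1)

print(np.max(np.abs(G - H)))  # shrinks as T grows
```

Since $\sum_{t=1}^T t^{j+k} = T^{j+k+1}/(j+k+1) + O(T^{j+k})$, each normalized entry converges to $1/(j+k+1)$ at rate $O(1/T)$, which the printed maximum deviation reflects.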

Page Count
23 pages

Category
Mathematics:
Statistics Theory