Multiscale Asymptotic Normality in Quantile Regression: Hilbert Matrices and Polynomial Designs
By: Saïd Maanan, Azzouz Dermoune, Ahmed El Ghini
Potential Business Impact:
Yields more reliable predictions and uncertainty estimates from data with outliers or heavy-tailed noise.
This paper investigates the asymptotic properties of quantile regression estimators in linear models, with a particular focus on polynomial regressors and robustness to heavy-tailed noise. Under independent and identically distributed (i.i.d.) errors with continuous density around the quantile of interest, we establish a general Central Limit Theorem (CLT) for the quantile regression estimator under normalization using $\Delta_n^{-1}$, yielding asymptotic normality with variance $\tau(1-\tau)/f^2(0) \cdot D_0^{-1}$. In the specific case of polynomial regressors, we show that the design structure induces a Hilbert matrix in the asymptotic covariance, and we derive explicit scaling rates for each coefficient. This generalizes Pollard's and Koenker's earlier results on LAD regression to arbitrary quantile levels $\tau \in (0, 1)$. We also examine the convergence behavior of the estimators and propose a relaxation of the standard CLT-based confidence intervals, motivated by a theoretical inclusion principle. This relaxation replaces the usual $T^{j+1/2}$ scaling with $T^\alpha$, for $\alpha < j + 1/2$, to improve finite-sample coverage. Through extensive simulations under Laplace, Gaussian, and Cauchy noise, we validate this approach and highlight the improved robustness and empirical accuracy of relaxed confidence intervals. This study provides both a unifying theoretical framework and practical inference tools for quantile regression under structured regressors and heavy-tailed disturbances.
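The abstract's central CLT claim can be illustrated in its simplest instance, the intercept-only design, where the quantile regression estimator reduces to the empirical $\tau$-quantile and the asymptotic variance formula becomes $\tau(1-\tau)/f^2(0)$. The following Monte Carlo sketch (an illustration under stated assumptions, not the paper's own experiment; all variable names are hypothetical) checks this scaling under standard Laplace noise, one of the three noise regimes the paper simulates:

```python
import numpy as np

# Minimal sketch of the quantile CLT in the intercept-only design:
# sqrt(n) * (q_hat - q_tau) -> N(0, tau(1-tau) / f(q_tau)^2).
# For tau = 0.5 and standard Laplace errors, q_tau = 0 and f(0) = 1/2,
# so the asymptotic variance is 0.25 / 0.25 = 1.
rng = np.random.default_rng(0)
tau, n, reps = 0.5, 2000, 500

f0 = 0.5                                        # Laplace density at its median
samples = rng.laplace(size=(reps, n))
q_hat = np.quantile(samples, tau, axis=1)       # empirical tau-quantile per replication

emp_var = n * q_hat.var()                       # Monte Carlo variance, scaled by n
asy_var = tau * (1 - tau) / f0**2               # theoretical tau(1-tau)/f(0)^2

print(emp_var, asy_var)                         # the two should be close
```

The same experiment with Cauchy noise (replace `rng.laplace` with `rng.standard_cauchy`) still converges, since the quantile CLT needs only a continuous positive density at the quantile, not finite moments; this is the robustness to heavy tails that the abstract emphasizes.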
Similar Papers
On eigenvalues of a renormalized sample correlation matrix
Statistics Theory
Studies the eigenvalue behavior of high-dimensional sample correlation matrices after renormalization.
Asymptotic distributions of four linear hypotheses test statistics under generalized spiked model
Statistics Theory
Derives limiting distributions of linear hypothesis test statistics under a generalized spiked covariance model.
Asymptotic Distribution of Low-Dimensional Patterns Induced by Non-Differentiable Regularizers under General Loss Functions
Statistics Theory
Characterizes the limiting distribution of low-dimensional solution patterns induced by non-differentiable regularizers.