Bayesian Smoothed Quantile Regression
By: Bingqi Liu, Kangqiang Li, Tianxiao Pang
Potential Business Impact:
Improves predictive accuracy for rare, extreme-quantile events such as tail financial risk.
Bayesian quantile regression (BQR) based on the asymmetric Laplace distribution (ALD) has two fundamental limitations: its posterior mean yields biased quantile estimates, and the non-differentiable check loss precludes gradient-based MCMC methods. We propose Bayesian smoothed quantile regression (BSQR), a principled reformulation that constructs a novel, continuously differentiable likelihood from a kernel-smoothed check loss, simultaneously ensuring a consistent posterior by aligning the inferential target with the smoothed objective and enabling efficient Hamiltonian Monte Carlo (HMC) sampling. Our theoretical analysis establishes posterior propriety for various priors and examines the impact of kernel choice. Simulations show BSQR reduces predictive check loss by up to 50% at extreme quantiles over ALD-based methods and improves MCMC efficiency by 20-40% in effective sample size. An application to financial risk during the COVID-19 era demonstrates superior tail risk modeling. The BSQR framework offers a theoretically grounded, computationally efficient solution to longstanding challenges in BQR, with uniform and triangular kernels emerging as highly effective.
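The core idea of the smoothed check loss can be illustrated concretely. The sketch below (our own minimal illustration, not the paper's exact likelihood construction; all names are ours) convolves the standard check loss with a uniform kernel of bandwidth `h`, which yields a closed-form quadratic bridge on `[-h, h]` and leaves the loss unchanged outside it, producing a continuously differentiable objective amenable to gradient-based samplers such as HMC.

```python
import numpy as np

def check_loss(u, tau):
    """Standard (non-differentiable) quantile check loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def smoothed_check_loss(u, tau, h):
    """Check loss convolved with a uniform kernel on [-h, h].

    Closed form: equals the ordinary check loss for |u| >= h, and a
    quadratic (hence C^1) bridge for |u| < h. The bridge matches the
    check loss in value and slope at u = +/- h.
    """
    u = np.asarray(u, dtype=float)
    inside = np.abs(u) < h
    # Quadratic bridge: (tau*(u+h)^2 - (tau-1)*(u-h)^2) / (4h)
    bridge = (tau * (u + h) ** 2 - (tau - 1) * (u - h) ** 2) / (4 * h)
    return np.where(inside, bridge, check_loss(u, tau))
```

Because the smoothed loss agrees with the ordinary check loss outside the bandwidth, shrinking `h` recovers the original objective while retaining differentiability everywhere, which is what makes a smoothed negative log-likelihood usable inside HMC.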