Risk Bounds For Distributional Regression
By: Carlos Misael Madrid Padilla, Oscar Hernan Madrid Padilla, Sabyasachi Chatterjee
Potential Business Impact:
Helps predict the full range of possible outcomes, not just a single best guess.
This work examines risk bounds for nonparametric distributional regression estimators. For convex-constrained distributional regression, general upper bounds are established for the continuous ranked probability score (CRPS) and the worst-case mean squared error (MSE) across the domain. These theoretical results are applied to isotonic and trend filtering distributional regression, yielding convergence rates consistent with those for mean estimation. Furthermore, a general upper bound is derived for distributional regression under non-convex constraints, with a specific application to neural network-based estimators. Comprehensive experiments on both simulated and real data validate the theoretical contributions, demonstrating their practical effectiveness.
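To make the abstract's objects concrete, here is a minimal Python sketch (not the authors' code) of the two central quantities: the CRPS of a sample-based forecast, computed via the standard identity CRPS(F, y) = E|X - y| - 0.5*E|X - X'|, and a simplified isotonic distributional regression that estimates the conditional CDF F(t | x) one threshold at a time. The function names, the per-threshold fitting scheme, and the toy data are all illustrative assumptions.

```python
# A minimal sketch, not the authors' code. `crps_empirical` scores an
# empirical predictive distribution; `isotonic_distributional_fit` estimates
# F(t | x) by isotonic regression of the indicator 1{y <= t} on x, one
# threshold at a time (a simplification of isotonic distributional regression).
import numpy as np
from sklearn.isotonic import IsotonicRegression

def crps_empirical(samples, y):
    """CRPS of the empirical distribution of `samples` at the outcome `y`."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - y))  # E|X - y|
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))  # 0.5*E|X - X'|
    return term1 - term2

def isotonic_distributional_fit(x, y, thresholds):
    """Estimate F(t | x) assuming Y is stochastically increasing in x,
    so P(Y <= t | x) is nonincreasing in x for every threshold t."""
    cdf_hat = np.empty((len(thresholds), len(x)))
    for i, t in enumerate(thresholds):
        iso = IsotonicRegression(increasing=False, y_min=0.0, y_max=1.0)
        cdf_hat[i] = iso.fit_transform(x, (y <= t).astype(float))
    return cdf_hat  # rows indexed by threshold, columns by design point

# Toy usage: outcomes shift upward with the covariate.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = 2 * x + rng.normal(0, 0.3, 200)
thresholds = np.linspace(y.min(), y.max(), 40)
F_hat = isotonic_distributional_fit(x, y, thresholds)
print(crps_empirical(rng.normal(size=1000), 0.2))  # score one forecast
```

Each threshold slice here is a least squares fit over the monotone cone, i.e. a convex-constrained regression, which is exactly the setting where the paper's CRPS and worst-case MSE upper bounds apply.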
Similar Papers
Wasserstein Distributionally Robust Nonparametric Regression
Machine Learning (Stat)
Keeps computer predictions reliable even when the data are shifted or corrupted.
Minimax Optimal Rates for Regression on Manifolds and Distributions
Statistics Theory
Shows how fast computers can learn when the data live on curved shapes or are whole distributions.
On Efficient Estimation of Distributional Treatment Effects under Covariate-Adaptive Randomization
Econometrics
Measures treatment effects more precisely in studies that balance groups as people enroll.