Debiased Inference for High-Dimensional Regression Models Based on Profile M-Estimation
By: Yi Wang, Yuhao Deng, Yu Gu, and more
Debiased inference for high-dimensional regression models has received substantial recent attention as a way to provide valid inference for regularized estimators. Existing methods achieve Neyman orthogonality by explicitly constructing projections onto the space of nuisance parameters, which is infeasible when an explicit form of the projection is unavailable. We introduce a general debiasing framework, Debiased Profile M-Estimation (DPME), which applies to a broad class of models and does not require the model-specific Neyman orthogonalization or projection derivations of existing methods. Our approach first obtains an initial estimator of the parameters by optimizing a penalized objective function. To correct for the bias introduced by penalization, we construct a one-step estimator via a Newton-Raphson update applied to the gradient of a profile function, defined as the optimal value of the objective function with the parameter of interest held fixed. The required derivatives are obtained by numerical differentiation, so no explicit gradient calculations are needed. The resulting DPME estimator is shown to be asymptotically linear and normally distributed. Through extensive simulations, we demonstrate that the proposed method achieves better coverage rates than existing alternatives at substantially reduced computational cost. Finally, we illustrate the utility of our method with an application to estimating a treatment rule for multiple myeloma.
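The procedure described in the abstract can be sketched in a toy setting. The following is a minimal illustration, not the paper's exact construction: all names, the lasso-penalized linear model, the step size `h`, and the use of `scipy.optimize.minimize` with the Powell method are assumptions made for the example. It shows the four ingredients: a penalized initial estimate, a profile objective with the parameter of interest held fixed, numerical differentiation of that profile, and a one-step Newton-Raphson update.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical setup: lasso-penalized linear regression, debiasing the
# first coefficient (true value 1.0).
rng = np.random.default_rng(0)
n, p = 400, 5
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 0.5, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)
lam = 0.05  # penalty level, chosen ad hoc for illustration

def loss(beta):
    return 0.5 * np.mean((y - X @ beta) ** 2)

def penalized_loss(beta):
    return loss(beta) + lam * np.sum(np.abs(beta))

# Step 1: initial penalized estimator (biased toward zero by the l1 penalty).
beta0 = minimize(penalized_loss, np.zeros(p), method="Powell").x
theta0 = beta0[0]

# Step 2: profile objective -- the parameter of interest is held fixed and the
# nuisance coordinates are re-optimized (penalty kept on the nuisance only).
def profile(theta):
    def obj(eta):
        return loss(np.concatenate(([theta], eta))) + lam * np.sum(np.abs(eta))
    return minimize(obj, beta0[1:], method="Powell").fun

# Step 3: first and second derivatives of the profile by central finite
# differences -- no analytic gradients or model-specific projections needed.
h = 0.05
g = (profile(theta0 + h) - profile(theta0 - h)) / (2 * h)
H = (profile(theta0 + h) - 2 * profile(theta0) + profile(theta0 - h)) / h ** 2

# Step 4: one-step Newton-Raphson update that corrects the penalization bias.
theta_dpme = theta0 - g / H
print(theta0, theta_dpme)
```

In this sketch the lasso estimate of the first coefficient is shrunk below its true value, and the one-step update moves it back toward 1.0; the paper's actual estimator has formal asymptotic-linearity guarantees that this toy version does not attempt to replicate.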