Robust Inference for Convex Pairwise Difference Estimators
By: Matias D. Cattaneo, Michael Jansson, Kenichi Nagasawa
Potential Business Impact:
Makes computer predictions more accurate with less data.
This paper develops distribution theory and bootstrap-based inference methods for a broad class of convex pairwise difference estimators. These estimators minimize a kernel-weighted, convex-in-parameter function over observation pairs that are similar in terms of certain covariates, where the similarity is governed by a localization (bandwidth) parameter. While classical results establish asymptotic normality under restrictive bandwidth conditions, we show that valid Gaussian and bootstrap-based inference remains possible under substantially weaker assumptions. First, we extend the theory of small bandwidth asymptotics to convex pairwise estimation settings, deriving robust Gaussian approximations even when a smaller-than-standard bandwidth is used. Second, we employ a debiasing procedure based on generalized jackknifing to enable inference with larger bandwidths, while preserving convexity of the objective function. Third, we construct a novel bootstrap method that adjusts for bandwidth-induced variance distortions, yielding valid inference across a wide range of bandwidth choices. Our proposed inference methods are demonstrably more robust, while retaining the practical appeal of convex pairwise difference estimators.
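To make the class of estimators concrete, the following is a minimal illustrative sketch (not the paper's notation or method) of a kernel-weighted pairwise difference objective: pairs of observations whose covariate `z` values are close, as governed by a bandwidth `h`, contribute a convex loss evaluated on the differenced data. All names and the Gaussian kernel choice here are assumptions for illustration.

```python
import numpy as np

def pairwise_objective(theta, x, z, y, h, loss):
    """Kernel-weighted pairwise-difference objective (illustrative sketch).

    Pairs (i, j) are weighted by a Gaussian kernel in (z_i - z_j)/h, so
    only pairs with similar covariates z contribute materially. The loss
    is convex in theta, so the overall objective is convex in theta.
    All names here are hypothetical, not the paper's notation.
    """
    n = len(y)
    total, wsum = 0.0, 0.0
    for i in range(n):
        for j in range(i + 1, n):
            u = (z[i] - z[j]) / h
            w = np.exp(-0.5 * u**2)  # kernel weight: localization in z
            # convex loss on the pairwise-differenced data
            total += w * loss((y[i] - y[j]) - (x[i] - x[j]) * theta)
            wsum += w
    return total / wsum
```

Differencing within pairs that are close in `z` approximately removes any additive nuisance component depending on `z`, which is the intuition behind pairwise difference estimation; the bandwidth `h` controls how local that comparison is.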
Similar Papers
A geometric ensemble method for Bayesian inference
Optimization and Control
Makes computers guess better about hidden things.
Inference in pseudo-observation-based regression using (biased) covariance estimation and naive bootstrapping
Methodology
Fixes math mistakes in computer predictions.
Nonparametric hazard rate estimation with associated kernels and minimax bandwidth choice
Statistics Theory
Predicts how long things will last.