Source-Condition Analysis of Kernel Adversarial Estimators

Published: August 24, 2025 | arXiv ID: 2508.17181v1

By: Antonio Olivas-Martinez, Andrea Rotnitzky

Potential Business Impact:

Improves the reliability of machine-learning estimators when the quantity of interest must be recovered from indirect, noisy data — the ill-posed inverse problems that arise, for example, in causal-effect estimation with instrumental variables.

Business Areas:
A/B Testing; Data and Analytics

In many applications, the target parameter depends on a nuisance function defined by a conditional moment restriction, whose estimation often leads to an ill-posed inverse problem. Classical approaches, such as sieve-based GMM, approximate the restriction using a fixed set of test functions and may fail to capture important aspects of the solution. Adversarial estimators address this limitation by framing estimation as a game between an estimator and an adaptive critic. We study the class of Regularized Adversarial Stabilized (RAS) estimators that employ reproducing kernel Hilbert spaces (RKHSs) for both estimation and testing, with regularization via the RKHS norm. Our first contribution is a novel analysis that establishes finite-sample bounds for both the weak error and the root mean squared error (RMSE) of these estimators under interpretable source conditions, in contrast to existing results. Our second contribution is a detailed comparison of the assumptions underlying this RKHS-norm-regularized approach with those required for (i) RAS estimators using $\mathcal{L}^2$ penalties, and (ii) recently proposed, computationally stable Kernel Maximal Moment estimators.
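The adversarial setup described above — an estimator in one RKHS played against a critic (test function) in another, regularized by the RKHS norm — admits a closed-form inner maximization, which makes the idea easy to illustrate. The following is a minimal sketch, not the authors' exact RAS procedure: it uses a simulated instrumental-variables problem with Gaussian kernels, where the outer minimization over the estimator's kernel coefficients reduces to a linear solve. All variable names (`Kx`, `Kz`, `lam`, the data-generating process) are illustrative assumptions.

```python
import numpy as np

def gaussian_gram(a, b, bandwidth=1.0):
    # Gaussian (RBF) kernel Gram matrix between 1-D sample vectors a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2.0 * bandwidth**2))

rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=n)                    # instrument
x = z + 0.3 * rng.normal(size=n)          # regressor correlated with z
y = np.sin(x) + 0.3 * rng.normal(size=n)  # outcome; E[Y - h0(X) | Z] = 0 with h0 = sin

Kx = gaussian_gram(x, x)  # RKHS for the estimator: h(x_i) = (Kx @ alpha)_i
Kz = gaussian_gram(z, z)  # RKHS for the adversarial test functions f(Z)
lam = 1e-3 * n            # RKHS-norm regularization strength (assumed value)

# For a unit-ball critic in the Z-RKHS, sup_f (1/n) sum_i r_i f(z_i) is
# proportional to sqrt(r' Kz r) with residuals r = y - Kx @ alpha, so the
# minimax problem collapses to the ridge-type quadratic
#   min_alpha  (y - Kx a)' Kz (y - Kx a) + lam * a' Kx a,
# whose first-order condition is the linear system solved below.
A = Kx @ Kz @ Kx + lam * Kx + 1e-8 * np.eye(n)  # small jitter for stability
alpha = np.linalg.solve(A, Kx @ Kz @ y)

h_hat = Kx @ alpha
rmse = np.sqrt(np.mean((h_hat - np.sin(x))**2))
print("in-sample RMSE vs h0:", round(rmse, 3))
```

The closed-form inner step is what makes kernel critics computationally attractive relative to adversarial critics parameterized by neural networks; the paper's analysis concerns how the choice of regularizer (RKHS norm versus $\mathcal{L}^2$ penalty) interacts with source conditions to control the weak error and RMSE of such estimators.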

Page Count
54 pages

Category
Mathematics:
Statistics Theory