A Distributionally-Robust Framework for Nuisance in Causal Effect Estimation

Published: May 23, 2025 | arXiv ID: 2505.17717v1

By: Akira Tanimoto

Potential Business Impact:

Enables more reliable estimates of a treatment's true effect from historically biased decision data.

Business Areas:
A/B Testing, Data and Analytics

Causal inference requires evaluating models on balanced distributions between treatment and control groups, while training data often exhibits imbalance due to historical decision-making policies. Most conventional statistical methods address this distribution shift through inverse probability weighting (IPW), which requires estimating propensity scores as an intermediate step. These methods face two key challenges: inaccurate propensity estimation and instability from extreme weights. We decompose the generalization error to isolate these issues (propensity ambiguity and statistical instability) and address them through an adversarial loss function. Our approach combines distributionally robust optimization for handling propensity uncertainty with weight regularization based on weighted Rademacher complexity. Experiments on synthetic and real-world datasets demonstrate consistent improvements over existing methods.
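To make the two failure modes concrete, here is a minimal sketch of plain IPW estimation of an average treatment effect on synthetic data. This is the conventional baseline the abstract critiques, not the paper's adversarial method; all variable names and the clipping threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic observational data: a confounder x drives both treatment and outcome.
n = 5000
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-1.5 * x))              # propensity score P(T=1 | x)
t = rng.binomial(1, p_true)                      # imbalanced treatment assignment
y = 2.0 * t + x + rng.normal(scale=0.5, size=n)  # true treatment effect = 2.0

# Plain IPW estimate of the average treatment effect (ATE). Here the true
# propensity is used; in practice it must be estimated, which is the
# "propensity ambiguity" the paper targets.
ate_ipw = np.mean(t / p_true * y) - np.mean((1 - t) / (1 - p_true) * y)

# Clipping propensities is a common ad-hoc fix for extreme weights (the
# "statistical instability" problem), at the cost of some bias.
p_clip = np.clip(p_true, 0.05, 0.95)
ate_clipped = np.mean(t / p_clip * y) - np.mean((1 - t) / (1 - p_clip) * y)

print(ate_ipw, ate_clipped)
```

Both estimates should land near the true effect of 2.0; units with propensities near 0 or 1 receive very large weights, which is the instability that the paper's weight regularization is designed to control.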

Page Count
22 pages

Category
Statistics > Machine Learning (stat.ML)