A Distributionally-Robust Framework for Nuisance in Causal Effect Estimation
By: Akira Tanimoto
Potential Business Impact:
Helps computers learn fair decisions from messy data.
Causal inference requires evaluating models on a balanced distribution between treatment and control groups, while training data often exhibits imbalance due to historical decision-making policies. Most conventional statistical methods address this distribution shift through inverse probability weighting (IPW), which requires estimating propensity scores as an intermediate step. These methods face two key challenges: inaccurate propensity estimation and instability from extreme weights. We decompose the generalization error to isolate these two issues, propensity ambiguity and statistical instability, and address them with a single adversarial loss function. Our approach combines distributionally robust optimization, which handles propensity uncertainty, with weight regularization based on weighted Rademacher complexity. Experiments on synthetic and real-world datasets demonstrate consistent improvements over existing methods.
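To make the IPW setup and its instability concrete, here is a minimal synthetic sketch (not the paper's method): a confounder drives both treatment assignment and outcome, a naive mean difference is biased, normalized IPW corrects it, and clipping the propensities is one crude stand-in for the kind of weight regularization the abstract motivates. All variable names and the data-generating process are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic data: a single confounder x drives both treatment and outcome.
x = rng.normal(size=n)
propensity = 1.0 / (1.0 + np.exp(-1.5 * x))  # imbalanced historical policy
t = rng.binomial(1, propensity)
y = 2.0 * t + x + rng.normal(scale=0.5, size=n)  # true ATE = 2.0

# Naive difference in means is biased because treated units have larger x.
naive = y[t == 1].mean() - y[t == 0].mean()

# Normalized (Hajek-style) IPW estimate, using the true propensity here;
# in practice the propensity must itself be estimated, which is the
# "propensity ambiguity" the abstract refers to.
w1 = t / propensity
w0 = (1 - t) / (1 - propensity)
ipw = (w1 * y).sum() / w1.sum() - (w0 * y).sum() / w0.sum()

# Units with propensity near 0 or 1 get extreme weights ("statistical
# instability"). Clipping trades a little bias for lower variance, a
# crude counterpart to principled weight regularization.
p_clip = np.clip(propensity, 0.05, 0.95)
w1c = t / p_clip
w0c = (1 - t) / (1 - p_clip)
ipw_clipped = (w1c * y).sum() / w1c.sum() - (w0c * y).sum() / w0c.sum()

print(naive, ipw, ipw_clipped)
```

On this toy draw, the naive estimate overshoots the true effect of 2.0 by roughly the confounding bias, while both weighted estimates land close to it; the paper's contribution lies in handling the case where the propensity is unknown and must be estimated robustly.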
Similar Papers
Low-rank Covariate Balancing Estimators under Interference
Methodology
Helps understand how things affect each other when connected.
Outcome-Informed Weighting for Robust ATE Estimation
Methodology
Helps find true causes in messy data.
A Generative Framework for Causal Estimation via Importance-Weighted Diffusion Distillation
Machine Learning (CS)
Helps doctors pick best treatments for each person.