When Shift Happens - Confounding Is to Blame

Published: May 27, 2025 | arXiv ID: 2505.21422v1

By: Abbavaram Gowtham Reddy, Celia Rubio-Madrigal, Rebekka Burkholz, and more

Potential Business Impact:

Helps machine learning models stay accurate even when the data they see changes.

Business Areas:
A/B Testing, Data and Analytics

Distribution shifts introduce uncertainty that undermines the robustness and generalization capabilities of machine learning models. While conventional wisdom suggests that learning causal-invariant representations enhances robustness to such shifts, recent empirical studies present a counterintuitive finding: (i) empirical risk minimization (ERM) can rival or even outperform state-of-the-art out-of-distribution (OOD) generalization methods, and (ii) its OOD generalization performance improves when all available covariates, not just causal ones, are utilized. Drawing on both empirical and theoretical evidence, we attribute this phenomenon to hidden confounding. Shifts in hidden confounding induce changes in data distributions that violate assumptions commonly made by existing OOD generalization approaches. Under such conditions, we prove that effective generalization requires learning environment-specific relationships, rather than relying solely on invariant ones. Furthermore, we show that models augmented with proxies for hidden confounders can mitigate the challenges posed by hidden confounding shifts. These findings offer new theoretical insights and practical guidance for designing robust OOD generalization algorithms and principled covariate selection strategies.
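To make the core claim concrete, here is a minimal, self-contained sketch (not the authors' code) of the hidden-confounding story under toy linear-Gaussian assumptions: an unobserved confounder U drives both the outcome and an observed proxy covariate, and U's distribution shifts between training and test. The names `sample_env`, `fit_ols`, and all parameter values are hypothetical choices for illustration. The sketch compares ERM on the causal covariate alone against ERM on all covariates, including the proxy.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_env(n, mu_u):
    """Sample one environment of a toy SCM.
    The hidden confounder U shifts across environments via mu_u."""
    u = rng.normal(mu_u, 1.0, n)        # hidden confounder (unobserved by the model)
    x_c = rng.normal(0.0, 1.0, n)       # causal covariate
    z = u + rng.normal(0.0, 0.3, n)     # observed proxy for the confounder
    y = 2.0 * x_c + 1.5 * u + rng.normal(0.0, 0.5, n)
    return x_c, z, y

def fit_ols(features, y):
    """Least-squares fit with an intercept column (plain ERM)."""
    X = np.column_stack([np.ones(len(y)), *features])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def mse(beta, features, y):
    X = np.column_stack([np.ones(len(y)), *features])
    return np.mean((X @ beta - y) ** 2)

# Train where the hidden confounder is centered at 0 ...
xc_tr, z_tr, y_tr = sample_env(20_000, mu_u=0.0)
# ... and test after a confounding shift (mean of U jumps to 3).
xc_te, z_te, y_te = sample_env(20_000, mu_u=3.0)

beta_causal = fit_ols([xc_tr], y_tr)        # causal covariate only
beta_all = fit_ols([xc_tr, z_tr], y_tr)     # all covariates, incl. proxy

print(f"causal-only OOD MSE: {mse(beta_causal, [xc_te], y_te):.2f}")
print(f"with-proxy  OOD MSE: {mse(beta_all, [xc_te, z_te], y_te):.2f}")
```

In this setup the causal-only model cannot track the shifted contribution of U and its test error explodes, while the model that also uses the proxy absorbs most of the confounding shift, mirroring the paper's finding that ERM over all available covariates, augmented with confounder proxies, can outperform causal-invariant approaches.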

Page Count
49 pages

Category
Computer Science:
Machine Learning (CS)