Multi-environment Invariance Learning with Missing Data
By: Yiran Jia
Potential Business Impact:
Finds stable patterns even with missing data.
Learning models that can handle distribution shifts is a key challenge in domain generalization. Invariance learning, which identifies features whose relationship to the outcome is stable across environments, improves generalization by capturing these stable relationships; when the data distribution is encoded by a structural equation model (SEM) satisfying modularity conditions, they may represent causal effects. A growing body of work builds on invariance learning, leveraging the inherent heterogeneity across environments to develop methods that provide causal explanations while enhancing robust prediction. In many practical scenarios, however, obtaining complete outcome data from every environment is difficult because data collection is costly or complex. Missing outcomes hinder models that fully exploit environmental heterogeneity, so addressing them is crucial for both causal insight and robust prediction. In this work, we derive an estimator from the invariance objective under missing outcomes. We establish non-asymptotic guarantees on the variable selection property and the $\ell_2$ error convergence rate, both of which depend on the proportion of missing data and the quality of the imputation models across environments. We evaluate the estimator in extensive simulations and demonstrate it on the UCI Bike Sharing dataset, predicting bike rental counts. The results show that, even with a biased imputation model, the estimator is efficient and achieves lower prediction error provided the bias stays within a reasonable range.
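The setup described above can be illustrated with a toy sketch. This is not the paper's estimator; it is a minimal stand-in that combines the same ingredients under stated assumptions: two simulated environments share a stable coefficient on one feature while another feature's effect shifts, a deliberately biased pooled OLS imputation fills in missing outcomes, and an IRMv1-style penalty (a common invariance objective, used here as an illustrative substitute) is minimized over the completed data. All names, the penalty form, and the constants are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_env(n, coef_stable, coef_shifting):
    # x1 -> y is invariant across environments; the x2 -> y effect shifts
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = coef_stable * x1 + coef_shifting * x2 + 0.1 * rng.normal(size=n)
    return np.column_stack([x1, x2]), y

# Two environments: stable coefficient 2.0 on x1, shifting effect of x2
envs = [make_env(500, 2.0, 1.0), make_env(500, 2.0, 3.0)]

# Mask ~30% of outcomes in each environment (missing at random)
masked = [(X, y, rng.random(len(y)) > 0.3) for X, y in envs]

# Impute missing outcomes with a biased pooled OLS fit on observed rows:
# it averages the two environments' x2 effects, so imputations are off
X_obs = np.vstack([X[m] for X, _, m in masked])
y_obs = np.concatenate([y[m] for _, y, m in masked])
beta_imp, *_ = np.linalg.lstsq(X_obs, y_obs, rcond=None)
completed = [(X, np.where(m, y, X @ beta_imp)) for X, y, m in masked]

def objective(beta, lam=10.0):
    # Pooled squared loss plus an IRMv1-style invariance penalty: the
    # squared gradient of each environment's risk with respect to a
    # scalar multiplier on the predictor, evaluated at 1.0
    loss, penalty = 0.0, 0.0
    for X, y in completed:
        pred = X @ beta
        r = pred - y
        loss += np.mean(r ** 2)
        g = 2.0 * np.mean(r * pred)
        penalty += g ** 2
    return loss + lam * penalty

# Crude grid search over beta, purely for illustration
best, best_val = None, np.inf
for b1 in np.linspace(0.0, 3.0, 61):
    for b2 in np.linspace(-2.0, 2.0, 81):
        beta = np.array([b1, b2])
        val = objective(beta)
        if val < best_val:
            best, best_val = beta, val

print(best)  # stable coefficient stays near 2.0; shifting one is pushed toward 0
```

Even though the imputation model is biased (it blends the two environments' x2 effects), the invariance penalty still discards the unstable feature, echoing the abstract's claim that a moderately biased imputation model does not break the estimator.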
Similar Papers
Environment Inference for Learning Generalizable Dynamical System
Machine Learning (CS)
Finds hidden patterns in data without knowing the environment.
Invariance Pair-Guided Learning: Enhancing Robustness in Neural Networks
Machine Learning (CS)
Teaches computers to learn without being fooled.
Invariant Learning with Annotation-free Environments
Machine Learning (CS)
Finds hidden patterns to make AI work anywhere.