Demystifying Spectral Feature Learning for Instrumental Variable Regression
By: Dimitri Meunier, Antoine Moulin, Jakub Wornbard, and more
Potential Business Impact:
Estimates causal effects in data despite hidden confounders.
We address the problem of causal effect estimation in the presence of hidden confounders, using nonparametric instrumental variable (IV) regression. A leading strategy employs spectral features, that is, learned features spanning the top eigensubspaces of the operator linking treatments to instruments. We derive a generalization error bound for a two-stage least squares estimator based on spectral features, yielding insight into the method's performance and failure modes. We show that performance depends on two key factors, leading to a clear taxonomy of outcomes. In the good scenario, the approach is optimal. This occurs under strong spectral alignment, meaning the structural function is well represented by the top eigenfunctions of the conditional operator, coupled with slow eigenvalue decay of that operator, indicating a strong instrument. Performance degrades in the bad scenario: spectral alignment remains strong, but rapid eigenvalue decay (indicating a weaker instrument) demands significantly more samples for effective feature learning. Finally, in the ugly scenario, weak spectral alignment causes the method to fail regardless of the eigenvalues' behavior. Our synthetic experiments empirically validate this taxonomy.
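To make the two-stage procedure concrete, here is a minimal NumPy sketch of spectral-feature 2SLS on synthetic confounded data. Everything below (the RBF feature maps, ridge penalties, number of spectral features `k`, and the data-generating process with structural function `sin`) is an illustrative assumption, not the paper's exact construction: the spectral features are taken as the top left singular vectors of the empirical cross-covariance between treatment and instrument features, then stage 1 predicts them from the instrument and stage 2 regresses the outcome on those predictions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
U = rng.normal(size=n)                               # hidden confounder
Z = rng.normal(size=n)                               # instrument
X = 0.8 * Z + 0.6 * U + 0.3 * rng.normal(size=n)     # treatment
Y = np.sin(X) + 0.6 * U + 0.3 * rng.normal(size=n)   # outcome; structural fn = sin

def feats(v, centers, width=0.5):
    # RBF feature map (an illustrative choice; the paper's features are learned)
    return np.exp(-(v[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

cx = np.linspace(-3, 3, 30)
cz = np.linspace(-3, 3, 30)
PhiX, PhiZ = feats(X, cx), feats(Z, cz)

# Empirical cross-covariance operator linking treatment and instrument features;
# its top singular subspace defines the spectral features of the treatment.
C = PhiX.T @ PhiZ / n
Ux, s, _ = np.linalg.svd(C)
k = 10
Psi = PhiX @ Ux[:, :k]                               # spectral features of X

# Stage 1: predict the spectral features from the instrument (ridge for stability)
lam = 1e-3
A = np.linalg.solve(PhiZ.T @ PhiZ / n + lam * np.eye(PhiZ.shape[1]),
                    PhiZ.T @ Psi / n)
Psi_hat = PhiZ @ A

# Stage 2: regress the outcome on the stage-1 predictions
b = np.linalg.solve(Psi_hat.T @ Psi_hat / n + lam * np.eye(k),
                    Psi_hat.T @ Y / n)

# Evaluate the learned structural function on a grid
xg = np.linspace(-2, 2, 9)
f_hat = feats(xg, cx) @ Ux[:, :k] @ b
```

The taxonomy in the abstract shows up directly in `s`: slow decay of these singular values (a strong instrument) makes stage 1 easy, while rapid decay means the later spectral features are estimated poorly, and if `sin` were not well captured by the top-`k` subspace, no amount of data would rescue stage 2.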
Similar Papers
Outcome-Aware Spectral Feature Learning for Instrumental Variable Regression
Machine Learning (Stat)
Finds hidden causes even when data is tricky.
Spectral decomposition-assisted multi-study factor analysis
Methodology
Finds common patterns across different science studies.
Spectral Estimators for Multi-Index Models: Precise Asymptotics and Optimal Weak Recovery
Machine Learning (Stat)
Finds hidden patterns in data faster.