Do-PFN: In-Context Learning for Causal Effect Estimation
By: Jake Robertson, Arik Reuter, Siyuan Guo, and more
Potential Business Impact:
Finds cause and effect without knowing all the rules.
Estimation of causal effects is critical to a range of scientific disciplines. Existing methods for this task either require interventional data or knowledge of the ground-truth causal graph, or rely on assumptions such as unconfoundedness, restricting their applicability in real-world settings. In the domain of tabular machine learning, prior-data fitted networks (PFNs) have achieved state-of-the-art predictive performance by being pre-trained on synthetic data to solve tabular prediction problems via in-context learning. To assess whether this approach transfers to the harder problem of causal effect estimation, we pre-train PFNs on synthetic data drawn from a wide variety of causal structures, including interventions, to predict interventional outcomes given observational data. Through extensive experiments on synthetic case studies, we show that our approach allows for accurate estimation of causal effects without knowledge of the underlying causal graph. We also perform ablation studies that elucidate Do-PFN's scalability and robustness across datasets with a variety of causal characteristics.
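To make the pre-training setup concrete, here is a minimal sketch of how one synthetic task of the kind described in the abstract could be generated: an observational context plus interventional targets that an in-context learner is trained to predict. The specific structural causal model (a single confounded Z -> X -> Y graph), its coefficients, and the sample sizes are illustrative assumptions, not details from the paper; the actual Do-PFN prior draws data from a wide variety of randomly sampled causal structures.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_toy_scm(n, do_x=None):
    """Sample from a toy linear-Gaussian SCM: Z -> X, Z -> Y, X -> Y.

    Z confounds X and Y. If do_x is given, X is set to that value for
    every unit (a hard intervention), severing the Z -> X edge.
    """
    z = rng.normal(size=n)                                  # confounder
    if do_x is None:
        x = 0.8 * z + rng.normal(scale=0.5, size=n)         # observational mechanism
    else:
        x = np.full(n, float(do_x))                         # do(X = do_x)
    y = 1.5 * x + 1.0 * z + rng.normal(scale=0.5, size=n)   # outcome mechanism
    return np.column_stack([z, x]), y

# One synthetic pre-training "task": observational context rows plus
# interventional query rows whose outcomes the network must predict in-context.
ctx_features, ctx_outcomes = sample_toy_scm(n=500)            # observational context
qry_features, qry_outcomes = sample_toy_scm(n=100, do_x=1.0)  # targets under do(X = 1)

# In this toy SCM the true mean outcome under do(X = 1) is 1.5 (the X -> Y
# coefficient), whereas a naive regression of Y on X in the observational
# data would be biased upward by the confounding path through Z.
print("empirical mean outcome under do(X=1):", qry_outcomes.mean())
```

Repeating this over many randomly drawn causal graphs and interventions yields the synthetic pre-training corpus; at inference time, only the observational context and the intervention query are supplied, and the model predicts the interventional outcomes in-context.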
Similar Papers
CausalPFN: Amortized Causal Effect Estimation via In-Context Learning
Machine Learning (CS)
Finds what causes what from data automatically.
Foundation Models for Causal Inference via Prior-Data Fitted Networks
Machine Learning (CS)
Helps computers understand cause and effect.
FairPFN: A Tabular Foundation Model for Causal Fairness
Machine Learning (CS)
Fixes unfair computer decisions without knowing why.