Score: 1

CausalPFN: Amortized Causal Effect Estimation via In-Context Learning

Published: June 9, 2025 | arXiv ID: 2506.07918v1

By: Vahid Balazadeh, Hamidreza Kamkari, Valentin Thomas and more

Potential Business Impact:

Automatically estimates causal effects (what causes what) from observational data, removing the need to hand-pick and tune a specialized estimator.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Causal effect estimation from observational data is fundamental to a wide range of applications. However, selecting an appropriate estimator from dozens of specialized methods demands substantial manual effort and domain expertise. We present CausalPFN, a single transformer that amortizes this workflow: trained once on a large library of simulated data-generating processes that satisfy ignorability, it infers causal effects for new observational datasets out-of-the-box. CausalPFN combines ideas from Bayesian causal inference with the large-scale training protocol of prior-fitted networks (PFNs), learning to map raw observations directly to causal effects without any task-specific adjustment. Our approach achieves superior average performance on heterogeneous and average treatment effect estimation benchmarks (IHDP, Lalonde, ACIC). Moreover, it shows competitive performance on uplift modeling tasks relevant to real-world policy making. CausalPFN provides calibrated uncertainty estimates to support reliable decision-making based on Bayesian principles. This ready-to-use model does not require any further training or tuning and takes a step toward automated causal inference (https://github.com/vdblm/CausalPFN).
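To make the training recipe in the abstract more concrete, below is a minimal sketch (not the authors' code; all names and the specific linear DGP are illustrative assumptions) of the kind of simulated data-generating process a PFN-style causal learner could be trained on: covariates X, a treatment T whose propensity depends only on X so that ignorability holds by construction, an outcome Y, and the ground-truth heterogeneous effect used as the regression target.

```python
# Sketch of one simulated DGP satisfying ignorability (hypothetical example,
# not taken from the CausalPFN repository).
import numpy as np

rng = np.random.default_rng(0)

def sample_dgp(n=512, d=5):
    """Draw one synthetic observational dataset with known causal effects."""
    # Random coefficients define this particular data-generating process.
    w_prop = rng.normal(size=d)   # propensity weights
    w_base = rng.normal(size=d)   # baseline outcome weights
    w_tau = rng.normal(size=d)    # treatment-effect weights

    X = rng.normal(size=(n, d))
    # Treatment assignment depends only on observed X, so ignorability holds.
    propensity = 1.0 / (1.0 + np.exp(-X @ w_prop))
    T = rng.binomial(1, propensity)

    tau = X @ w_tau               # ground-truth heterogeneous effect (CATE)
    Y = X @ w_base + T * tau + rng.normal(scale=0.5, size=n)
    return X, T, Y, tau

# One training example for a PFN-style learner: the raw observations (X, T, Y)
# serve as the in-context "prompt", while tau is the target the network learns
# to predict, amortizing causal effect estimation across many such DGPs.
X, T, Y, tau = sample_dgp()
print(X.shape, round(T.mean(), 2), round(tau.mean(), 2))
```

At inference time, a new observational dataset plays the role of (X, T, Y): the pretrained network conditions on it in context and outputs effect estimates directly, with no ground-truth tau and no task-specific training or tuning.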

Country of Origin
🇨🇦 Canada


Page Count
31 pages

Category
Computer Science:
Machine Learning (CS)