Think Global, Act Local: Bayesian Causal Discovery with Language Models in Sequential Data
By: Prakhar Verma, David Arbour, Sunav Choudhary, and more
Potential Business Impact:
Finds hidden causes in messy, incomplete information.
Causal discovery from observational data typically assumes full access to the data and the availability of domain experts. In practice, data often arrive in batches, and expert knowledge is scarce. Language Models (LMs) offer a surrogate but come with their own issues: hallucinations, inconsistencies, and bias. We present BLANCE (Bayesian LM-Augmented Causal Estimation), a hybrid Bayesian framework that bridges these gaps by adaptively integrating sequential batch data with noisy, LM-derived expert knowledge while accounting for both data-induced and LM-induced biases. Our proposed representation shift from a Directed Acyclic Graph (DAG) to a Partial Ancestral Graph (PAG) accommodates ambiguities within a coherent Bayesian framework, allowing global LM knowledge to be grounded in local observational data. To guide LM interaction, we use a sequential optimization scheme that adaptively queries the most informative edges. Across varied datasets, BLANCE outperforms prior work in structural accuracy, extends to Bayesian parameter estimation, and shows robustness to LM noise.
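To make the abstract's workflow concrete, here is a minimal toy sketch of the general idea: maintain a Bayesian belief over each candidate edge, update it from sequential data batches, and adaptively query a noisy LM oracle about the most uncertain edge, discounting its answer. All names, the Beta-Bernoulli belief model, and the noise weights are illustrative assumptions, not BLANCE's actual algorithm.

```python
import math
import random

random.seed(0)

def entropy(p):
    """Binary entropy of an edge-existence probability, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

class EdgeBelief:
    """Beta posterior over whether a candidate edge exists (toy model)."""
    def __init__(self):
        self.alpha, self.beta = 1.0, 1.0  # uniform Beta(1, 1) prior

    def update(self, successes, failures):
        self.alpha += successes
        self.beta += failures

    @property
    def mean(self):
        return self.alpha / (self.alpha + self.beta)

# Two hypothetical candidate edges in a tiny graph.
edges = {("X", "Y"): EdgeBelief(), ("Y", "Z"): EdgeBelief()}

# Sequential batches: each yields noisy (support, refute) evidence counts.
batches = [{("X", "Y"): (8, 2), ("Y", "Z"): (5, 5)},
           {("X", "Y"): (9, 1), ("Y", "Z"): (4, 6)}]

for batch in batches:
    # Ground beliefs in the local batch data.
    for edge, (s, f) in batch.items():
        edges[edge].update(s, f)
    # Adaptively pick the most uncertain edge to ask the LM about.
    query = max(edges, key=lambda e: entropy(edges[e].mean))
    # Simulate a noisy LM answer (says "edge exists" with prob 0.8 here).
    lm_says_exists = random.random() < 0.8
    # Soft update: down-weight the LM vote to account for its noise.
    weight = 0.5
    if lm_says_exists:
        edges[query].update(weight, 0.0)
    else:
        edges[query].update(0.0, weight)

for edge, belief in edges.items():
    print(edge, round(belief.mean, 3))
```

The key design choice mirrored from the abstract is that LM answers are treated as noisy evidence folded into a posterior rather than as ground truth, and queries are spent where uncertainty is highest.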
Similar Papers
Causal-aware Large Language Models: Enhancing Decision-Making Through Learning, Adapting and Acting
Machine Learning (CS)
Helps computers learn and make better choices.
Causal MAS: A Survey of Large Language Model Architectures for Discovery and Effect Estimation
Artificial Intelligence
Helps AI understand why things happen.
Causal Discovery from Data Assisted by Large Language Models
Materials Science
Finds new materials by connecting knowledge and data.