Score: 3

Think Global, Act Local: Bayesian Causal Discovery with Language Models in Sequential Data

Published: June 19, 2025 | arXiv ID: 2506.16234v1

By: Prakhar Verma, David Arbour, Sunav Choudhary, and more

BigTech Affiliations: University of Washington

Potential Business Impact:

Finds hidden causes in messy, incomplete information.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Causal discovery from observational data typically assumes full access to the data and the availability of domain experts. In practice, data often arrive in batches, and expert knowledge is scarce. Language Models (LMs) offer a surrogate but come with their own issues: hallucinations, inconsistencies, and bias. We present BLANCE (Bayesian LM-Augmented Causal Estimation), a hybrid Bayesian framework that bridges these gaps by adaptively integrating sequential batch data with noisy, LM-derived expert knowledge, while accounting for both data-induced and LM-induced biases. Our proposed representation shift from Directed Acyclic Graphs (DAGs) to Partial Ancestral Graphs (PAGs) accommodates ambiguities within a coherent Bayesian framework, allowing the global LM knowledge to be grounded in local observational data. To guide LM interaction, we use a sequential optimization scheme that adaptively queries the most informative edges. Across varied datasets, BLANCE outperforms prior work in structural accuracy and extends to Bayesian parameter estimation, showing robustness to LM noise.
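The core idea of "query the most informative edge, then update beliefs with a noisy oracle" can be sketched in a few lines. This is an illustrative toy, not the paper's actual method: the edge names, the fixed oracle `accuracy`, and the entropy-based selection rule are all assumptions for the sake of the example.

```python
import math

def entropy(p):
    # Binary entropy in bits; zero when the belief is certain.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def most_informative_edge(beliefs):
    # Pick the edge whose presence probability is most uncertain,
    # i.e. the one where an LM answer is worth the most.
    return max(beliefs, key=lambda e: entropy(beliefs[e]))

def update_edge(p, lm_says_present, accuracy=0.7):
    # Bayes update of the edge-presence probability, treating the LM
    # as a noisy oracle with an assumed fixed accuracy.
    like_present = accuracy if lm_says_present else 1 - accuracy
    like_absent = 1 - accuracy if lm_says_present else accuracy
    num = like_present * p
    return num / (num + like_absent * (1 - p))

# Hypothetical beliefs over three candidate edges.
beliefs = {("X", "Y"): 0.5, ("Y", "Z"): 0.9, ("X", "Z"): 0.2}
edge = most_informative_edge(beliefs)  # ("X", "Y"): maximally uncertain
beliefs[edge] = update_edge(beliefs[edge], lm_says_present=True)
```

In this sketch the LM's answer moves the belief on ("X", "Y") from 0.5 to 0.7; because the oracle is modeled as noisy, no single answer forces the belief to certainty, which mirrors the robustness-to-LM-noise claim in the abstract at a very high level.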

Country of Origin
🇺🇸 🇫🇮 United States, Finland

Page Count
24 pages

Category
Computer Science:
Machine Learning (CS)