Large Language Bayes

Published: April 18, 2025 | arXiv ID: 2504.14025v1

By: Justin Domke

Potential Business Impact:

Automatically builds formal Bayesian statistical models from informal, plain-language problem descriptions.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Many domain experts do not have the time or training to write formal Bayesian models. This paper takes an informal problem description as input and combines a large language model with a probabilistic programming language to create a joint distribution over formal models, latent variables, and data. A posterior over latent variables follows by conditioning on observed data and integrating over formal models. This presents a challenging inference problem. We suggest an inference recipe that amounts to generating many formal models from the large language model, performing approximate inference on each, and then doing a weighted average. This is justified and analyzed as a combination of self-normalized importance sampling, MCMC, and variational inference. We show that this produces sensible predictions without the need to specify a formal model.
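The abstract's recipe (generate many candidate models, run approximate inference on each, then combine them with a weighted average) can be illustrated with a minimal, self-contained sketch. The code below is an assumption-laden toy, not the paper's implementation: the candidate models are hard-coded Gamma-Poisson variants standing in for LLM-generated probabilistic programs, simple prior-proposal importance sampling stands in for the MCMC/variational inference used per model, and each model is weighted in proportion to its estimated marginal likelihood as a stand-in for the self-normalized weights described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observed counts for an informal question like
# "how often does this event happen per day?"
data = np.array([3, 5, 4, 6, 2])

# Stand-ins for formal models an LLM might emit: each is a
# (sample_prior, log_likelihood) pair over a latent Poisson rate.
# These particular models are illustrative assumptions only.
def make_gamma_poisson(a, b):
    def sample_prior(n):
        return rng.gamma(a, 1.0 / b, size=n)   # Gamma(shape=a, rate=b) prior
    def log_lik(rate):
        # Poisson log-likelihood of all observations, up to a constant
        return np.sum(data[:, None] * np.log(rate) - rate, axis=0)
    return sample_prior, log_lik

candidate_models = [
    make_gamma_poisson(2.0, 1.0),   # vague prior
    make_gamma_poisson(10.0, 2.0),  # informative prior
    make_gamma_poisson(1.0, 0.1),   # very diffuse prior
]

n_samples = 5000
log_evidence = []   # per-model log marginal-likelihood estimates
post_means = []     # per-model approximate posterior means of the rate

for sample_prior, log_lik in candidate_models:
    rates = sample_prior(n_samples)
    logw = log_lik(rates)
    # Self-normalized importance sampling within one model,
    # using the prior as the proposal distribution.
    w = np.exp(logw - logw.max())
    post_means.append(np.sum(w * rates) / np.sum(w))
    # Log of the Monte Carlo evidence estimate E_prior[likelihood]
    log_evidence.append(logw.max() + np.log(np.mean(w)))

# Weighted average across models: each model contributes in
# proportion to its estimated evidence (self-normalized weights).
log_evidence = np.array(log_evidence)
model_w = np.exp(log_evidence - log_evidence.max())
model_w /= model_w.sum()

combined_mean = float(np.dot(model_w, post_means))
print("model weights:", np.round(model_w, 3))
print("combined posterior mean rate:", round(combined_mean, 3))
```

In this sketch the final prediction is a mixture over candidate models rather than a commitment to any single one, which mirrors the abstract's point that the user never has to specify a formal model themselves.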

Country of Origin
🇺🇸 United States

Page Count
57 pages

Category
Computer Science:
Machine Learning (CS)