
A Bayesian Generative Modeling Approach for Arbitrary Conditional Inference

Published: January 8, 2026 | arXiv ID: 2601.05355v1

By: Qiao Liu, Wing Hung Wong

Potential Business Impact:

Lets a single trained AI model answer arbitrary "what if" (conditional) questions about data without retraining.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Modern data analysis increasingly requires flexible conditional inference P(X_B | X_A), where (X_A, X_B) is an arbitrary partition of the observed variables X. Existing conditional inference methods lack this flexibility: they are tied to a fixed conditioning structure and cannot perform new conditional inference once trained. To address this, we propose a Bayesian generative modeling (BGM) approach for arbitrary conditional inference without retraining. BGM learns a generative model of X through an iterative Bayesian updating algorithm in which model parameters and latent variables are updated until convergence. Once trained, any conditional distribution can be obtained without retraining. Empirically, BGM achieves superior prediction performance with well-calibrated predictive intervals, demonstrating that a single learned model can serve as a universal engine for conditional prediction with uncertainty quantification. We provide theoretical guarantees for the convergence of the stochastic iterative algorithm, statistical consistency, and conditional-risk bounds. The proposed BGM framework leverages the power of AI to capture complex relationships among variables while adhering to Bayesian principles, making it a promising approach for advancing a wide range of applications in modern data science. The code for BGM is freely available at https://github.com/liuq-lab/bayesgm.
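The abstract's two-stage recipe, train once by alternating latent and parameter updates, then condition on any partition (A, B) at query time, can be illustrated on a toy linear-Gaussian instance. The sketch below is not the authors' implementation (that lives at https://github.com/liuq-lab/bayesgm); the model X = W Z + noise, the exact Gaussian posterior updates, and all variable names are assumptions chosen so the math is closed-form. BGM itself uses a deep generative model and a stochastic iterative Bayesian updating algorithm, but the structure, fit p(X) once, then sample Z | X_A and push it through the generator to get P(X_B | X_A), is the same.

```python
# Illustrative sketch of the BGM idea on a linear-Gaussian model (an
# assumption for tractability), NOT the authors' code.
import numpy as np

rng = np.random.default_rng(0)
n, d_x, d_z = 500, 4, 2

# Synthetic data from an unknown linear-Gaussian generative model.
W_true = rng.normal(size=(d_x, d_z))
X = rng.normal(size=(n, d_z)) @ W_true.T + 0.1 * rng.normal(size=(n, d_x))

# --- Training: alternate latent and parameter updates until convergence ---
W = rng.normal(size=(d_x, d_z))   # model parameters
Z = rng.normal(size=(n, d_z))     # per-observation latent variables
sigma2 = 0.1 ** 2                 # noise variance, assumed known here

for it in range(200):
    # Latent update: sample from the exact Gaussian posterior p(Z_i | X_i, W).
    cov = np.linalg.inv(np.eye(d_z) + W.T @ W / sigma2)
    mean = X @ W @ cov / sigma2
    Z = mean + rng.normal(size=(n, d_z)) @ np.linalg.cholesky(cov).T
    # Parameter update: MAP estimate of W under a Gaussian prior
    # (ridge-regularized least squares of X on the current latents).
    W = X.T @ Z @ np.linalg.inv(Z.T @ Z + np.eye(d_z))

# --- Arbitrary conditional inference P(X_B | X_A) without retraining ---
# The partition enters only at query time; pick any split of the dimensions.
A, B = [0, 1], [2, 3]
x_A = X[0, A]                     # observed part of a query point

# Posterior over Z given only X_A (again exact in the linear-Gaussian case).
W_A, W_B = W[A, :], W[B, :]
cov_A = np.linalg.inv(np.eye(d_z) + W_A.T @ W_A / sigma2)
mean_A = cov_A @ W_A.T @ x_A / sigma2

# Push posterior latent samples through the generator (plus observation
# noise) to draw from P(X_B | X_A): predictions with uncertainty.
z_samp = mean_A + rng.normal(size=(1000, d_z)) @ np.linalg.cholesky(cov_A).T
x_B_samp = z_samp @ W_B.T + np.sqrt(sigma2) * rng.normal(size=(1000, len(B)))
print("posterior mean of X_B:", x_B_samp.mean(axis=0))
print("90% predictive interval:", np.quantile(x_B_samp, [0.05, 0.95], axis=0))
```

Because the partition (A, B) is chosen only at query time, the same trained parameters answer any conditional query, which is the "universal engine" property the abstract claims.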

Repos / Data Links
https://github.com/liuq-lab/bayesgm

Page Count
43 pages

Category
Statistics: Machine Learning (stat.ML)