REFINESTAT: Efficient Exploration for Probabilistic Program Synthesis
By: Madhav Kanda, Shubham Ugare, Sasa Misailovic
Potential Business Impact:
Makes computer programs understand uncertainty better.
Probabilistic programming offers a powerful framework for modeling uncertainty, yet statistical model discovery in this domain entails navigating an immense search space under strict domain-specific constraints. When small language models are tasked with generating probabilistic programs, they frequently produce outputs that suffer from both syntactic and semantic errors, such as flawed inference constructs. Motivated by probabilistic programmers' domain expertise and debugging strategies, we introduce RefineStat, a language-model-driven framework that enforces semantic constraints ensuring synthesized programs contain valid distributions and well-formed parameters, and then applies diagnostic-aware refinement by resampling prior or likelihood components whenever reliability checks fail. We evaluate RefineStat on multiple probabilistic-programming code-generation tasks using smaller language models (SLMs) and find that it produces programs that are both syntactically sound and statistically reliable, often matching or surpassing those from closed-source large language models (e.g., OpenAI o3).
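The abstract describes a two-stage loop: first enforce semantic constraints (valid distributions with well-formed parameters), then run diagnostics and resample the failing prior or likelihood component. A minimal sketch of that control flow, with illustrative placeholders (`VALID_DISTS`, `run_diagnostics`, `resample_component` are assumptions, not RefineStat's actual API — in the real system the resampling would query the language model and the diagnostics would be MCMC reliability checks such as R-hat):

```python
import random

# Illustrative sketch of a RefineStat-style loop:
#   (1) semantic checks on the candidate program,
#   (2) diagnostic-aware refinement that resamples the prior or
#       likelihood component whenever a reliability check fails.

VALID_DISTS = {"Normal": 2, "HalfNormal": 1, "Exponential": 1}  # name -> arity

def semantically_valid(program):
    """Every distribution must be known and have well-formed parameters."""
    return all(
        dist in VALID_DISTS and len(params) == VALID_DISTS[dist]
        for dist, params in program
    )

def run_diagnostics(program):
    """Stand-in for statistical reliability checks (e.g. R-hat, divergences).
    Here a component is flagged at random just to keep the sketch runnable."""
    if random.random() < 0.5:
        return None  # all checks passed
    return random.randrange(len(program))  # index of the failing component

def resample_component(program, idx):
    """Replace the failing prior/likelihood with a fresh draw (mocked here;
    RefineStat would resample this component via the language model)."""
    dist = random.choice(list(VALID_DISTS))
    program = list(program)
    program[idx] = (dist, tuple(1.0 for _ in range(VALID_DISTS[dist])))
    return program

def refine(program, max_rounds=10):
    assert semantically_valid(program)  # stage 1: semantic constraints
    for _ in range(max_rounds):        # stage 2: diagnostic-aware refinement
        failing = run_diagnostics(program)
        if failing is None:
            return program             # statistically reliable
        program = resample_component(program, failing)
    return program

random.seed(0)
model = [("Normal", (0.0, 1.0)), ("Exponential", (1.0,))]
print(refine(model))
```

The key design point mirrored here is that refinement is component-local: only the prior or likelihood term implicated by the failed check is resampled, rather than regenerating the whole program.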
Similar Papers
Sound Interval-Based Synthesis for Probabilistic Programs
Programming Languages
Helps scientists find answers without needing math skills.
Probabilistic Programming with Sufficient Statistics for faster Bayesian Computation
Computation
Makes computer models run much faster.
Structural Abstraction and Refinement for Probabilistic Programs
Formal Languages and Automata Theory
Checks if tricky computer programs are fair.