Gradient-bridged Posterior: Bayesian Inference for Models with Implicit Functions

Published: March 14, 2025 | arXiv ID: 2503.11637v1

By: Cheng Zeng, Yaozhi Yang, Jason Xu, and more

Potential Business Impact:

Makes statistical inference faster and cheaper for models that contain embedded optimization problems, such as flow networks and shape alignment.

Business Areas:
A/B Testing, Data and Analytics

Many statistical problems include model parameters that are defined as the solutions to optimization sub-problems. These include classical approaches such as profile likelihood as well as modern applications involving flow networks or Procrustes distances. In such cases, the likelihood of the data involves an implicit function, often complicating inferential procedures and entailing prohibitive computational cost. In this article, we propose an intuitive and tractable posterior inference approach for this setting. We introduce a class of continuous models that handle implicit function values using the first-order optimality of the sub-problems. Specifically, we apply a shrinkage kernel to the gradient norm, which retains a probabilistic interpretation within a generative model. This can be understood as a generalization of the Gibbs posterior framework that newly enables concentration around partial minimizers in a subset of the parameters. We show that this method, termed the gradient-bridged posterior, is amenable to efficient posterior computation and enjoys theoretical guarantees, including a Bernstein–von Mises theorem for asymptotic normality. The advantages of our approach are highlighted on a synthetic flow network experiment and an application to data integration using Procrustes distances.
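To make the construction concrete, the following is a minimal Python sketch of the idea on a toy problem. The sub-problem, its gradient, the kernel bandwidth `eps`, the priors, and the random-walk sampler are all illustrative assumptions for exposition, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy set-up (illustrative): the parameter of interest theta enters the
# likelihood only through an implicit value
#   w(theta) = argmin_w g(theta, w),  with  g(theta, w) = 0.5 * (w - exp(theta))**2,
# so the partial minimizer is w = exp(theta) and first-order optimality reads
#   grad_w g(theta, w) = w - exp(theta) = 0.
def grad_w(theta, w):
    return w - np.exp(theta)

# Simulated data: y_i ~ N(w_true, sigma^2) with w_true = exp(theta_true).
theta_true, sigma = 0.8, 0.3
y = rng.normal(np.exp(theta_true), sigma, size=50)

def log_post(theta, w, eps=0.05):
    """Gradient-bridged log posterior, up to an additive constant.

    Likelihood in (theta, w), weak Gaussian priors, and a Gaussian shrinkage
    kernel on the gradient norm of the sub-problem; as eps shrinks, posterior
    mass concentrates near the partial minimizer w(theta) without ever
    solving the inner optimization.
    """
    loglik = -0.5 * np.sum((y - w) ** 2) / sigma**2
    logprior = -0.5 * (theta**2 + w**2) / 100.0
    bridge = -0.5 * (grad_w(theta, w) / eps) ** 2   # shrinkage kernel on the gradient
    return loglik + logprior + bridge

# Plain random-walk Metropolis over (theta, w) jointly; no inner solver needed.
theta, w = 0.0, 1.0
cur = log_post(theta, w)
draws = []
for _ in range(20000):
    theta_p = theta + 0.05 * rng.normal()
    w_p = w + 0.05 * rng.normal()
    prop = log_post(theta_p, w_p)
    if np.log(rng.uniform()) < prop - cur:
        theta, w, cur = theta_p, w_p, prop
    draws.append((theta, w))

draws = np.array(draws[5000:])                      # drop burn-in
print("posterior mean (theta, w):", draws.mean(axis=0))
```

The key line is the `bridge` term: rather than solving the sub-problem at every likelihood evaluation, the sampler treats w as a free parameter and penalizes the gradient norm of the sub-problem, so the joint posterior concentrates around partial minimizers in the spirit of the gradient-bridged posterior.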

Country of Origin
🇺🇸 United States

Page Count
31 pages

Category
Statistics: Methodology