Maxitive Donsker-Varadhan Formulation for Possibilistic Variational Inference

Published: November 26, 2025 | arXiv ID: 2511.21223v1

By: Jasraj Singh, Shelvia Wongso, Jeremie Houssineau, and more

Potential Business Impact:

Enables machine-learning systems to reason robustly when data is sparse or imprecise.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Variational inference (VI) is a cornerstone of modern Bayesian learning, enabling approximate inference in complex models that would otherwise be intractable. However, its formulation depends on expectations and divergences defined through high-dimensional integrals, often rendering analytical treatment impossible and necessitating heavy reliance on approximate learning and inference techniques. Possibility theory, an imprecise-probability framework, makes it possible to model epistemic uncertainty directly rather than through subjective probabilities. While this framework provides robustness and interpretability under sparse or imprecise information, adapting VI to the possibilistic setting requires rethinking core concepts such as entropy and divergence, which presuppose additivity. In this work, we develop a principled formulation of possibilistic variational inference and apply it to a special class of exponential-family functions, highlighting parallels with their probabilistic counterparts and revealing the distinctive mathematical structures of possibility theory.
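For context, the title refers to the Donsker-Varadhan representation: in the classical probabilistic setting, the Kullback-Leibler divergence underlying VI admits the variational form

```latex
\mathrm{KL}(P \,\|\, Q) \;=\; \sup_{f}\; \Big\{\, \mathbb{E}_{P}[f] \;-\; \log \mathbb{E}_{Q}\!\left[e^{f}\right] \Big\},
```

where the supremum runs over bounded measurable functions f. Judging from the title and abstract, the paper develops a maxitive, possibilistic counterpart of this identity; the precise operators involved are not given in this summary.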
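To make the additive-versus-maxitive contrast concrete, here is a minimal, illustrative sketch (not the paper's construction): a probabilistic expectation aggregates by a weighted sum over a density, whereas a Shilkret-type maxitive integral, one common choice in possibility theory, aggregates by a supremum over a possibility function normalized to have maximum one. All names below are illustrative.

```python
import numpy as np

# Discrete grid and a bounded non-negative test function f.
x = np.linspace(-3.0, 3.0, 601)
f = np.exp(-x**2)

# Additive (probabilistic) model: a density p, normalized to SUM to 1.
p = np.exp(-0.5 * x**2)
p /= p.sum()
expectation = np.sum(p * f)  # E_p[f]: aggregation by weighted sum

# Maxitive (possibilistic) model: a possibility function pi,
# normalized so that max(pi) = 1 rather than sum(pi) = 1.
pi = np.exp(-0.5 * x**2)
pi /= pi.max()
shilkret = np.max(pi * f)  # Shilkret-type integral: aggregation by supremum

print(f"additive expectation E_p[f]    = {expectation:.4f}")
print(f"maxitive (Shilkret) sup(pi*f)  = {shilkret:.4f}")
```

Replacing sums with suprema in this way is exactly what breaks additivity and motivates the rethinking of entropy and divergence that the abstract describes.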

Page Count
17 pages

Category
Statistics: Machine Learning (stat.ML)