Approximate Bayesian Inference via Bitstring Representations

Published: August 19, 2025 | arXiv ID: 2508.13598v1

By: Aleksanteri Sladek, Martin Trapp, Arno Solin

Potential Business Impact:

Enables uncertainty-aware machine learning models that run efficiently on low-precision hardware.

Business Areas:
Machine Learning, Science and Engineering

The machine learning community has recently put effort into quantized or low-precision arithmetic to scale large models. This paper proposes performing probabilistic inference in the quantized, discrete parameter space created by these representations, effectively enabling us to learn a continuous distribution using discrete parameters. We consider both 2D densities and quantized neural networks, where we introduce a tractable learning approach using probabilistic circuits. This method offers a scalable solution to manage complex distributions and provides clear insights into model behavior. We validate our approach with various models, demonstrating inference efficiency without sacrificing accuracy. This work advances scalable, interpretable machine learning by utilizing discrete approximations for probabilistic computations.
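To give a flavor of the core idea, here is a minimal sketch (not the paper's implementation, which uses probabilistic circuits) of exact Bayesian inference over a quantized parameter space: each parameter value is one of 2^b bitstrings mapping to a grid point, so the posterior becomes a categorical distribution that can be computed by enumeration. The prior, likelihood, and grid range below are illustrative assumptions.

```python
import math

BITS = 4  # 4-bit quantization -> 2**4 = 16 representable parameter values
# Map each bitstring 0..15 to a grid point in [-1, 1]
GRID = [-1.0 + 2.0 * i / (2**BITS - 1) for i in range(2**BITS)]

def log_prior(theta):
    # Standard normal prior, evaluated at the quantized grid points
    return -0.5 * theta**2

def log_likelihood(theta, data):
    # Gaussian observation model with unit noise (assumed for this sketch)
    return sum(-0.5 * (y - theta)**2 for y in data)

def posterior(data):
    # Enumerate all bitstrings: the posterior is an exact categorical
    # distribution over the 2**BITS quantized parameter values
    logs = [log_prior(t) + log_likelihood(t, data) for t in GRID]
    m = max(logs)                          # log-sum-exp for stability
    w = [math.exp(l - m) for l in logs]
    z = sum(w)
    return [wi / z for wi in w]

data = [0.4, 0.6, 0.5]
p = posterior(data)
post_mean = sum(t * pi for t, pi in zip(GRID, p))
```

Brute-force enumeration like this scales exponentially in the number of parameters; the paper's contribution is making such discrete inference tractable at neural-network scale via probabilistic circuits.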

Repos / Data Links

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)