The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM

Published: May 16, 2025 | arXiv ID: 2505.11635v1

By: Nikhil Kapasi, William Whitehead, Luke Theogarajan

Potential Business Impact:

Teaches computers to remember and reason better.

Business Areas:
A/B Testing Data and Analytics

Many real-world tasks, from associative memory to symbolic reasoning, demand discrete, structured representations that standard continuous latent models struggle to express naturally. We introduce the Gaussian-Multinoulli Restricted Boltzmann Machine (GM-RBM), a generative energy-based model that extends the Gaussian-Bernoulli RBM (GB-RBM) by replacing binary hidden units with $q$-state Potts variables. This modification enables a combinatorially richer latent space and supports learning over multivalued, interpretable latent concepts. We formally derive the GM-RBM's energy function, learning dynamics, and conditional distributions, showing that it preserves tractable inference and training through contrastive divergence. Empirically, we demonstrate that GM-RBMs model complex multimodal distributions more effectively than binary RBMs, outperforming them on tasks involving analogical recall and structured memory. Our results highlight GM-RBMs as a scalable framework for discrete latent inference with enhanced expressiveness and interpretability.
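To make the abstract's construction concrete, the following is a minimal NumPy sketch of the idea as described: Gaussian visible units, hidden units that are $q$-state Potts (multinoulli) variables sampled via a per-unit softmax, and a CD-1 training step. All dimensions, variable names, and the specific parameterization (a weight tensor with one slice per Potts state) are illustrative assumptions, not the paper's actual derivation or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
# n_v Gaussian visible units, n_h hidden units, each taking one of q Potts states.
n_v, n_h, q, sigma = 6, 4, 3, 1.0

W = rng.normal(0.0, 0.01, size=(n_v, n_h, q))  # one weight slice per Potts state
b = np.zeros(n_v)                              # visible (Gaussian mean) biases
c = np.zeros((n_h, q))                         # per-state hidden biases

def hidden_probs(v):
    """P(h_j = k | v): softmax over the q states of each hidden unit."""
    logits = c + np.einsum('i,ijk->jk', v / sigma**2, W)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def sample_hidden(p):
    """Draw a one-hot sample of each unit's q-state Potts variable."""
    h = np.zeros_like(p)
    for j in range(p.shape[0]):
        h[j, rng.choice(p.shape[1], p=p[j])] = 1.0
    return h

def visible_mean(h):
    """P(v | h) is Gaussian with this mean and variance sigma^2."""
    return b + np.einsum('ijk,jk->i', W, h)

def cd1_step(v0, lr=0.01):
    """One contrastive-divergence (CD-1) update: positive phase on the
    data, one Gibbs step for the negative phase."""
    global W, b, c
    p0 = hidden_probs(v0)
    h0 = sample_hidden(p0)
    v1 = visible_mean(h0) + sigma * rng.normal(size=n_v)  # Gibbs reconstruction
    p1 = hidden_probs(v1)
    W += lr * (np.einsum('i,jk->ijk', v0, p0)
               - np.einsum('i,jk->ijk', v1, p1)) / sigma**2
    b += lr * (v0 - v1) / sigma**2
    c += lr * (p0 - p1)
    return v1

v = rng.normal(size=n_v)
for _ in range(10):
    v = cd1_step(v)
```

Note that setting q = 2 recovers (a reparameterization of) the binary-hidden GB-RBM, which is why the Potts extension strictly enlarges the latent state space: each hidden unit contributes log2(q) rather than 1 bit.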

Country of Origin
🇺🇸 United States

Page Count
11 pages

Category
Computer Science:
Machine Learning (CS)