The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM
By: Nikhil Kapasi, William Whitehead, Luke Theogarajan
Potential Business Impact:
Teaches computers to remember and reason better.
Many real-world tasks, from associative memory to symbolic reasoning, demand discrete, structured representations that standard continuous latent models struggle to express naturally. We introduce the Gaussian-Multinoulli Restricted Boltzmann Machine (GM-RBM), a generative energy-based model that extends the Gaussian-Bernoulli RBM (GB-RBM) by replacing binary hidden units with $q$-state Potts variables. This modification enables a combinatorially richer latent space and supports learning over multivalued, interpretable latent concepts. We formally derive the GM-RBM's energy function, learning dynamics, and conditional distributions, showing that it preserves tractable inference and training through contrastive divergence. Empirically, we demonstrate that GM-RBMs model complex multimodal distributions more effectively than binary RBMs, outperforming them on tasks involving analogical recall and structured memory. Our results highlight GM-RBMs as a scalable framework for discrete latent inference with enhanced expressiveness and interpretability.
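To make the construction concrete, here is a minimal sketch of the form such a Potts-extended energy function could take, assuming a one-hot encoding $h_{jk} \in \{0,1\}$, $\sum_k h_{jk} = 1$, for each hidden unit's $q$ states; the notation ($W_{ijk}$, $b_i$, $c_{jk}$, $\sigma_i^2$) is ours, and the paper's exact parameterization may differ:

$$
E(v, h) \;=\; \sum_i \frac{(v_i - b_i)^2}{2\sigma_i^2} \;-\; \sum_{j,k} c_{jk}\, h_{jk} \;-\; \sum_{i,j,k} \frac{v_i}{\sigma_i^2}\, W_{ijk}\, h_{jk}.
$$

Under this form both conditionals stay tractable, which is what lets block Gibbs sampling and contrastive divergence go through: $p(h_j = k \mid v)$ is a softmax over the $q$ states with logits $c_{jk} + \sum_i v_i W_{ijk} / \sigma_i^2$, and $p(v_i \mid h)$ is Gaussian with mean $b_i + \sum_{j,k} W_{ijk} h_{jk}$ and variance $\sigma_i^2$. Setting $q = 2$ recovers a Gaussian-Bernoulli RBM up to reparameterization, consistent with the abstract's framing of the GM-RBM as a strict extension.

The NumPy sketch below turns these assumed conditionals into a CD-1 training step. It is an illustration of the technique under the energy above, not the authors' implementation; all names (`sample_h`, `cd1_grads`, and so on) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed shapes: n_v visible units, n_h hidden Potts units, q states each.
# Hypothetical parameterization; the paper's exact energy may differ.
n_v, n_h, q = 6, 4, 3
W = 0.01 * rng.standard_normal((n_v, n_h, q))  # couplings W[i, j, k]
b = np.zeros(n_v)        # visible biases
c = np.zeros((n_h, q))   # one hidden bias per Potts state
sigma2 = np.ones(n_v)    # visible variances (held fixed in this sketch)

def sample_h(v):
    """p(h_j = k | v) is a softmax over each hidden unit's q states."""
    logits = c + np.einsum("i,ijk->jk", v / sigma2, W)  # shape (n_h, q)
    logits -= logits.max(axis=1, keepdims=True)         # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    h = np.zeros((n_h, q))                              # one-hot samples
    for j in range(n_h):
        h[j, rng.choice(q, p=p[j])] = 1.0
    return h, p

def sample_v(h):
    """p(v_i | h) is Gaussian with mean b_i + sum_{j,k} W[i,j,k] h[j,k]."""
    mean = b + np.einsum("ijk,jk->i", W, h)
    return mean + np.sqrt(sigma2) * rng.standard_normal(n_v)

def cd1_grads(v0):
    """CD-1: positive-phase minus negative-phase sufficient statistics."""
    h0, p0 = sample_h(v0)       # positive phase (data clamped)
    v1 = sample_v(h0)           # one Gibbs step for the negative phase
    _, p1 = sample_h(v1)
    gW = (np.einsum("i,jk->ijk", v0 / sigma2, p0)
          - np.einsum("i,jk->ijk", v1 / sigma2, p1))
    gb = (v0 - v1) / sigma2
    gc = p0 - p1
    return gW, gb, gc

# Toy usage: one pass of stochastic CD-1 over synthetic data.
data = rng.standard_normal((100, n_v))
lr = 0.01
for v0 in data:
    gW, gb, gc = cd1_grads(v0)
    W += lr * gW
    b += lr * gb
    c += lr * gc
```

Note the design choice in `cd1_grads`: the chain is advanced with a sampled one-hot state `h0`, while the gradients use the softmax probabilities `p0`, `p1` (the conditional expectations), a common variance-reduction practice in CD training of RBMs.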
Similar Papers
On the role of non-linear latent features in bipartite generative neural networks
Disordered Systems and Neural Networks
Improves computer memory recall by changing how it learns.
Photonic restricted Boltzmann machine for content generation tasks
Optics
Makes AI create pictures and sounds faster.
Generative modeling using evolved quantum Boltzmann machines
Quantum Physics
Teaches quantum computers to learn hard patterns.