Formalized Hopfield Networks and Boltzmann Machines

Published: December 8, 2025 | arXiv ID: 2512.07766v1

By: Matteo Cipollina, Michail Karatarakis, Freek Wiedijk

Potential Business Impact:

Provides machine-checked proofs that pattern-storing neural networks learn and recall reliably, increasing trust in AI systems.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Neural networks are widely used, yet their analysis and verification remain challenging. In this work, we present a Lean 4 formalization of neural networks, covering both deterministic and stochastic models. We first formalize Hopfield networks, recurrent networks that store patterns as stable states. We prove convergence and the correctness of Hebbian learning, a training rule that updates network parameters to encode patterns, here limited to the case of pairwise-orthogonal patterns. We then consider stochastic networks, where updates are probabilistic and convergence is to a stationary distribution. As a canonical example, we formalize the dynamics of Boltzmann machines and prove their ergodicity, showing convergence to a unique stationary distribution using a new formalization of the Perron-Frobenius theorem.
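To make the Hebbian-learning result concrete, here is a minimal numeric sketch (in Python, not the paper's Lean 4 development) of the property formalized in the abstract: when the stored ±1 patterns are pairwise orthogonal, the Hebbian weight matrix makes each stored pattern a stable state (a fixed point) of the Hopfield update rule. Function names and the small example patterns are our own illustration, not taken from the paper.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = sum_k p_k p_k^T, with the diagonal zeroed
    (Hopfield networks have no self-connections)."""
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0)
    return W

def update(W, state):
    """One synchronous Hopfield update: s' = sign(W s), with sign(0) := +1."""
    h = W @ state
    return np.where(h >= 0, 1, -1)

# Two pairwise-orthogonal +/-1 patterns of length 4 (dot product is 0);
# orthogonality is exactly the hypothesis of the formalized correctness theorem.
p1 = np.array([1, 1, -1, -1])
p2 = np.array([1, -1, 1, -1])
patterns = np.vstack([p1, p2])
assert p1 @ p2 == 0

W = hebbian_weights(patterns)
for p in patterns:
    # Each stored pattern is a fixed point: updating it returns it unchanged.
    assert np.array_equal(update(W, p), p)
print("both orthogonal patterns are stable states")
```

For p1, the field is W p1 = (p1·p1) p1 + (p2·p1) p2 − 2 p1 = 2 p1, so the sign update leaves p1 unchanged; the orthogonality assumption is what kills the cross term (p2·p1) p2.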

Page Count
24 pages

Category
Computer Science:
Machine Learning (CS)