Formalized Hopfield Networks and Boltzmann Machines
By: Matteo Cipollina, Michail Karatarakis, Freek Wiedijk
Potential Business Impact:
Provides machine-checked guarantees that neural networks converge and that learning rules store and recall patterns correctly, supporting the verification of AI components.
Neural networks are widely used, yet their analysis and verification remain challenging. In this work, we present a Lean 4 formalization of neural networks, covering both deterministic and stochastic models. We first formalize Hopfield networks, recurrent networks that store patterns as stable states. We prove convergence of the network dynamics and the correctness of Hebbian learning, a training rule that updates network parameters to encode patterns; the correctness proof is limited to the case of pairwise-orthogonal patterns. We then consider stochastic networks, where updates are probabilistic and convergence is to a stationary distribution rather than a fixed state. As a canonical example, we formalize the dynamics of Boltzmann machines and prove their ergodicity, showing convergence to a unique stationary distribution via a new formalization of the Perron-Frobenius theorem.
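To make the deterministic model concrete, here is a minimal Lean 4 sketch of a Hopfield network with ±1 states: the local field, the asynchronous update rule, the energy function on which the convergence argument rests, and the Hebbian outer-product weights. All names here (HopfieldNet, localField, update, energy, hebbW) are illustrative, not the paper's actual definitions, which may be structured quite differently on top of Mathlib.

```lean
import Mathlib

-- ±1 spin states encoded as Bool: true ↦ +1, false ↦ -1.
def toInt (b : Bool) : ℤ := if b then 1 else -1

-- A network on n neurons: weights and thresholds.  A full development would
-- also carry hypotheses that W is symmetric with zero diagonal.
structure HopfieldNet (n : ℕ) where
  W : Fin n → Fin n → ℤ
  θ : Fin n → ℤ

-- Local field seen by neuron i in state s.
def localField {n : ℕ} (net : HopfieldNet n) (s : Fin n → Bool) (i : Fin n) : ℤ :=
  (∑ j, net.W i j * toInt (s j)) - net.θ i

-- Asynchronous update: neuron i moves to the sign of its local field,
-- all other neurons are left unchanged.
def update {n : ℕ} (net : HopfieldNet n) (s : Fin n → Bool) (i : Fin n) :
    Fin n → Bool :=
  fun j => if j = i then decide (0 ≤ localField net s i) else s j

-- Energy of a state; asynchronous updates never increase it, and since the
-- state space is finite this yields convergence to a stable state.  (For
-- symmetric W with zero diagonal the double sum is even, so the integer
-- division by 2 is exact.)
def energy {n : ℕ} (net : HopfieldNet n) (s : Fin n → Bool) : ℤ :=
  -(∑ i, ∑ j, net.W i j * toInt (s i) * toInt (s j)) / 2
    + ∑ i, net.θ i * toInt (s i)

-- Hebbian outer-product rule for a list P of patterns to store.
def hebbW {n : ℕ} (P : List (Fin n → Bool)) (i j : Fin n) : ℤ :=
  if i = j then 0 else (P.map fun ξ => toInt (ξ i) * toInt (ξ j)).sum
```

In this reading, correctness of Hebbian learning says that each stored pattern is a fixed point of update for the network with weights hebbW P, which the pairwise-orthogonality assumption on the patterns makes provable.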
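On the stochastic side, here is a sketch of the Gibbs dynamics of a Boltzmann machine, continuing the definitions above: each unit is set to +1 with probability given by the logistic function of its temperature-scaled local field, and the stationary distribution is the Gibbs distribution determined by the same energy. Again, sigmoid, flipProb, and gibbsWeight are illustrative names under assumed conventions, not the formalization's API.

```lean
noncomputable section

-- Logistic activation σ(x) = 1 / (1 + exp (-x)).
def sigmoid (x : ℝ) : ℝ := 1 / (1 + Real.exp (-x))

-- Probability that a Gibbs update at temperature T > 0 sets neuron i to +1.
def flipProb {n : ℕ} (net : HopfieldNet n) (T : ℝ)
    (s : Fin n → Bool) (i : Fin n) : ℝ :=
  sigmoid ((localField net s i : ℝ) / T)

-- Unnormalized Gibbs weight of a state; the stationary distribution is
-- proportional to this.  Because every flipProb lies strictly between
-- 0 and 1, the chain can reach any state from any other, a power of its
-- transition matrix is positive, and the Perron-Frobenius theorem gives
-- uniqueness of the stationary distribution, i.e. ergodicity.
def gibbsWeight {n : ℕ} (net : HopfieldNet n) (T : ℝ) (s : Fin n → Bool) : ℝ :=
  Real.exp (-(energy net s : ℝ) / T)

end
```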