Correspondence Between Ising Machines and Neural Networks
By: Andrew G. Moore
Potential Business Impact:
Ising-based computers could learn at any temperature, not just near their coldest state.
Computation with the Ising model is central to emerging computing technologies such as quantum annealing, adiabatic quantum computing, and thermodynamic classical computing. Traditionally, the result of a computation has been encoded in the system's ground state. This paper generalizes ground-state computation to computation with spin averages, allowing computations to take place at high temperatures. It then introduces a systematic correspondence between Ising devices and neural networks, along with a simple method for running trained feed-forward neural networks on Ising-type hardware. Finally, a mathematical proof shows that these implementations always succeed.
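The core idea of reading out spin averages rather than ground states can be illustrated with a standard result from statistical mechanics: a single ±1 spin in a local field h has the exact thermal average ⟨s⟩ = tanh(βh), which has the same form as a tanh neuron activation. The sketch below is illustrative only and is not the paper's construction; the function names and the single-neuron mapping are assumptions for the example.

```python
import math

def spin_average_boltzmann(h, beta):
    """Exact thermal average <s> of one +/-1 spin in field h,
    computed directly from the Boltzmann distribution:
    Z = e^{beta*h} + e^{-beta*h},  <s> = (e^{beta*h} - e^{-beta*h}) / Z."""
    up, down = math.exp(beta * h), math.exp(-beta * h)
    return (up - down) / (up + down)

def spin_average(h, beta):
    """Closed form of the same average: <s> = tanh(beta * h)."""
    return math.tanh(beta * h)

def spin_neuron(weights, inputs, bias, beta=1.0):
    # Hypothetical mapping: the pre-activation w.x + b plays the role of
    # the spin's local field, so the thermal average acts as a tanh neuron.
    h = sum(w * x for w, x in zip(weights, inputs)) + bias
    return spin_average(h, beta)
```

Here the readout is a finite-temperature average, not a ground-state lookup: as beta grows (temperature drops), tanh(βh) sharpens toward sign(h), recovering the conventional ground-state picture as a limiting case.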
Similar Papers
Geometric Theory of Ising Machines
Emerging Technologies
Maps computer problems to make them easier.
Optimized Machine Learning Methods for Studying the Thermodynamic Behavior of Complex Spin Systems
Computational Physics
Finds hidden patterns in magnets using smart computer eyes.
Probabilistic Computing Optimization of Complex Spin-Glass Topologies
Disordered Systems and Neural Networks
Solves hard puzzles faster using special computer bits.