Lagrange Oscillatory Neural Networks for Constraint Satisfaction and Optimization
By: Corentin Delacour, Bram Haverkort, Filip Sabo, and others
Potential Business Impact:
Helps computers solve hard constrained problems by steering them away from infeasible answers.
Physics-inspired computing paradigms are receiving renewed attention to enhance efficiency in compute-intensive tasks such as artificial intelligence and optimization. Similar to Hopfield neural networks, oscillatory neural networks (ONNs) minimize an Ising energy function that embeds the solutions of hard combinatorial optimization problems. Despite their success in solving unconstrained optimization problems, Ising machines still face challenges with constrained problems as they can get stuck at infeasible local minima. In this paper, we introduce a Lagrange ONN (LagONN) designed to escape infeasible states based on the theory of Lagrange multipliers. Unlike existing oscillatory Ising machines, LagONN employs additional Lagrange oscillators to guide the system towards feasible states in an augmented energy landscape and settles only when constraints are met. Taking the maximum satisfiability problem with three literals as a use case (Max-3-SAT), we harness LagONN's constraint satisfaction mechanism to find optimal solutions for random SATlib instances with up to 200 variables and 860 clauses, which provides a deterministic alternative to simulated annealing for coupled oscillators. We further discuss the potential of Lagrange oscillators to address other constraints, such as phase copying, which is useful in oscillatory Ising machines with limited connectivity.
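The mechanism the abstract describes — extra Lagrange degrees of freedom that climb the augmented energy while the main variables descend it, so the system can only settle where constraints hold — can be illustrated with a minimal sketch. The following is a hypothetical scalar analogue of such saddle-point dynamics, not the paper's oscillator circuit or Max-3-SAT encoding: it minimizes a toy objective f(x) = x² subject to g(x) = x − 1 = 0 via the augmented Lagrangian L(x, λ) = f(x) + λ·g(x) + (c/2)·g(x)². The penalty weight `c` and step size `dt` are illustrative choices.

```python
# Toy saddle-point dynamics in the spirit of LagONN (hypothetical
# scalar analogue, not the paper's oscillator model):
#   minimize f(x) = x^2  subject to  g(x) = x - 1 = 0
# using the augmented Lagrangian
#   L(x, lam) = x^2 + lam*(x - 1) + (c/2)*(x - 1)^2.
# The variable x performs gradient DESCENT on L while the Lagrange
# variable lam performs gradient ASCENT, so a fixed point requires
# g(x) = 0: the system settles only when the constraint is met.

def lagrange_dynamics(x=0.0, lam=0.0, c=10.0, dt=0.01, steps=20_000):
    for _ in range(steps):
        g = x - 1.0                    # constraint violation g(x)
        dx = -(2.0 * x + lam + c * g)  # descent: -dL/dx
        dlam = g                       # ascent:  +dL/dlam
        x += dt * dx
        lam += dt * dlam
    return x, lam

x, lam = lagrange_dynamics()
print(x, lam)  # settles near the feasible point x = 1, with lam = -2
```

Without the Lagrange variable, plain descent on f would stop at the infeasible minimum x = 0; the ascending multiplier keeps tilting the landscape until the constraint is satisfied, which is the escape mechanism LagONN implements with oscillator phases.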
Similar Papers
Solving Sudoku Using Oscillatory Neural Networks
Disordered Systems and Neural Networks
Solves Sudoku puzzles faster using brain-like connections.
LaPON: A Lagrange's-mean-value-theorem-inspired operator network for solving PDEs and its application on NSE
Computational Physics
Solves hard science problems faster and more accurately.
Overcoming Quadratic Hardware Scaling for a Fully Connected Digital Oscillatory Neural Network
Hardware Architecture
Builds bigger computer brains that learn faster.