Quantitative Attractor Analysis of High-Capacity Kernel Logistic Regression Hopfield Networks
By: Akira Tamamori
Potential Business Impact:
Stores way more memories without mistakes.
Traditional Hopfield networks, trained with Hebbian learning, face severe storage capacity limits ($\approx 0.14$ P/N) and suffer from spurious attractors. Kernel Logistic Regression (KLR) offers a non-linear alternative, mapping patterns into a high-dimensional feature space where they become more easily separable. Our previous work showed that KLR learning dramatically improves capacity and noise robustness over conventional methods. This paper quantitatively analyzes the attractor structure of KLR-trained networks through extensive simulations. We evaluated recall from diverse initial states across a wide range of storage loads (up to 4.0 P/N) and noise levels, and quantified convergence rates and speed. Our analysis confirms KLR's superior performance: high capacity (up to 4.0 P/N) and strong noise robustness. The attractor landscape is remarkably "clean," with near-zero spurious fixed points. Recall failures under high load or noise are primarily due to convergence to other learned patterns rather than to spurious states. The dynamics are exceptionally fast, typically converging within 1-2 steps from high-similarity initial states. This characterization reveals how KLR reshapes the network dynamics to support high-capacity associative memory, highlighting its effectiveness and contributing to the broader understanding of associative memory.
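To make the setup concrete, below is a minimal sketch of a KLR-style Hopfield network: one kernelised logistic classifier per neuron predicts that neuron's next state from the full current state, and recall iterates these predictions to a fixed point. The RBF kernel, the kernel width, the regularization strength, and the use of scikit-learn's LogisticRegression on a precomputed kernel matrix are illustrative assumptions, not the paper's exact training procedure.

```python
# Sketch of a KLR-trained Hopfield network (assumptions: RBF kernel, and
# scikit-learn LogisticRegression fit on the kernel matrix as a stand-in
# for full kernel logistic regression; the paper's method may differ).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
N, P = 100, 200                                   # neurons, patterns (load P/N = 2.0)
patterns = rng.choice([-1.0, 1.0], size=(P, N))   # stored bipolar patterns

gamma = 1.0 / N                                   # assumed RBF kernel width
K_train = rbf_kernel(patterns, patterns, gamma=gamma)   # P x P kernel matrix

# Train one classifier per neuron: predict bit i of a pattern from the whole pattern.
classifiers = []
for i in range(N):
    clf = LogisticRegression(C=1e3, max_iter=1000)
    clf.fit(K_train, (patterns[:, i] > 0).astype(int))
    classifiers.append(clf)

def recall(state, max_steps=10):
    """Synchronously update all neurons until a fixed point or the step limit."""
    for _ in range(max_steps):
        k = rbf_kernel(state[None, :], patterns, gamma=gamma)  # 1 x P kernel vector
        new_state = np.array([2.0 * clf.predict(k)[0] - 1.0 for clf in classifiers])
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# Usage: corrupt 20% of a stored pattern's bits, then measure recall overlap.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1
print("overlap after recall:", recall(cue) @ patterns[0] / N)
```

In this sketch, the overlap printed at the end approaches 1.0 when the noisy cue converges back onto the stored pattern; counting how often it instead lands on a different stored pattern versus an unlearned fixed point is the kind of attractor bookkeeping the paper carries out at scale.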
Similar Papers
Kernel Logistic Regression Learning for High-Capacity Hopfield Networks
Machine Learning (CS)
Stores way more memories in computer brains.
Kernel Ridge Regression for Efficient Learning of High-Capacity Hopfield Networks
Machine Learning (CS)
Makes computer memories store more, faster.
Self-Organization of Attractor Landscapes in High-Capacity Kernel Logistic Regression Hopfield Networks
Machine Learning (CS)
Makes computer memory store much more information.