Quantitative Attractor Analysis of High-Capacity Kernel Logistic Regression Hopfield Networks

Published: May 2, 2025 | arXiv ID: 2505.01218v1

By: Akira Tamamori

Potential Business Impact:

Stores far more memories without recall errors.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Traditional Hopfield networks, using Hebbian learning, face severe storage capacity limits ($\approx 0.14$ P/N) and spurious attractors. Kernel Logistic Regression (KLR) offers a non-linear approach, mapping patterns to high-dimensional feature spaces for improved separability. Our previous work showed KLR dramatically improves capacity and noise robustness over conventional methods. This paper quantitatively analyzes the attractor structures in KLR-trained networks via extensive simulations. We evaluated recall from diverse initial states across a wide range of storage loads (up to 4.0 P/N) and noise levels, quantifying convergence rates and speed. Our analysis confirms KLR's superior performance: high capacity (up to 4.0 P/N) and robustness. The attractor landscape is remarkably "clean," with near-zero spurious fixed points. Recall failures under high load/noise are primarily due to convergence to other learned patterns, not spurious ones. Dynamics are exceptionally fast (typically 1-2 steps for high-similarity states). This characterization reveals how KLR reshapes dynamics for high-capacity associative memory, highlighting its effectiveness and contributing to the broader understanding of associative memory.
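The recall mechanism described in the abstract can be sketched compactly: instead of a Hebbian weight matrix, each neuron's next state is predicted by a kernel logistic regression classifier trained on the stored patterns. Below is a minimal illustrative sketch, assuming an RBF kernel, per-neuron classifiers fit with scikit-learn on the Gram matrix (the dual form of KLR), and synchronous updates; the paper's actual kernel, regularization, and update schedule may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N, P = 64, 128                      # neurons, stored patterns (load P/N = 2.0)
patterns = rng.choice([-1, 1], size=(P, N))

def rbf_kernel(A, B, gamma=1.0 / 64):
    # Gram matrix of the RBF kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel logistic regression in dual form: one classifier per neuron,
# trained on the Gram matrix of the stored patterns.
K_train = rbf_kernel(patterns, patterns)
classifiers = []
for i in range(N):
    clf = LogisticRegression(C=1e3, max_iter=1000)
    clf.fit(K_train, (patterns[:, i] > 0).astype(int))
    classifiers.append(clf)

def recall(state, max_steps=10):
    # Synchronous update: each neuron takes the sign predicted by its classifier.
    for _ in range(max_steps):
        k = rbf_kernel(state[None, :], patterns)
        new = np.array([1 if clf.predict(k)[0] else -1 for clf in classifiers])
        if np.array_equal(new, state):   # reached a fixed point
            break
        state = new
    return state

# Probe recall from a noisy version of a stored pattern (10% of bits flipped).
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
print("overlap with target after recall:", (recall(probe) @ patterns[0]) / N)
```

In this toy setting the noisy probe typically converges back to the stored pattern within one or two synchronous steps, which mirrors the fast convergence behavior reported in the paper, though the exact numbers here depend on the assumed kernel width and regularization rather than on the authors' settings.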

Page Count
8 pages

Category
Computer Science:
Machine Learning (CS)