Kernel Logistic Regression Learning for High-Capacity Hopfield Networks
By: Akira Tamamori
Potential Business Impact:
Lets neural associative memories store many more patterns reliably.
Hebbian learning limits Hopfield network storage capacity to a pattern-to-neuron ratio of about 0.14. We propose Kernel Logistic Regression (KLR) learning. Unlike linear methods, KLR uses kernels to implicitly map patterns into a high-dimensional feature space, enhancing their separability. By learning dual variables, KLR dramatically improves storage capacity, achieving perfect recall even when the number of patterns exceeds the number of neurons (demonstrated up to a ratio of 1.5), and improves noise robustness. KLR demonstrably outperforms both Hebbian and linear logistic regression approaches.
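The dual-variable formulation can be sketched in a few lines of NumPy. The example below is a minimal illustration, not the authors' implementation: the RBF kernel, the hyperparameters (gamma, lr, lam, n_epochs), and the synchronous recall schedule are all assumptions made for this sketch. Each neuron gets its own kernel logistic regression fit on the stored patterns, and recall iterates a sign update on the kernelized local field.

```python
# Minimal sketch of KLR learning for a Hopfield-style network (assumed RBF
# kernel, full-batch gradient descent on dual variables; all hyperparameters
# here are illustrative, not taken from the paper).
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 120                           # neurons, stored patterns (ratio > 1)
patterns = rng.choice([-1.0, 1.0], size=(P, N))

def rbf_kernel(A, B, gamma=0.01):
    """K[m, n] = exp(-gamma * ||A_m - B_n||^2)."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

# Training: one kernel logistic regression per neuron, in the dual.
# Neuron i models P(s_i = +1 | s) = sigmoid(sum_mu alpha[mu, i] * K(xi_mu, s)),
# fit on the stored patterns themselves (input = pattern, target = its bit i).
K = rbf_kernel(patterns, patterns)        # (P, P) Gram matrix
alpha = np.zeros((P, N))                  # dual variables, one column per neuron
targets01 = (patterns + 1.0) / 2.0        # map {-1,+1} bits to {0,1} labels
lr, lam, n_epochs = 0.5, 1e-4, 500        # assumed values for the sketch

for _ in range(n_epochs):
    probs = 1.0 / (1.0 + np.exp(-K @ alpha))       # (P, N) predicted P(bit=+1)
    grad = K @ (probs - targets01) / P + lam * alpha
    alpha -= lr * grad

# Recall: synchronous updates of the network state via the dual expansion.
def recall(state, n_steps=20):
    s = state.copy()
    for _ in range(n_steps):
        k = rbf_kernel(s[None, :], patterns)[0]    # similarities to stored patterns
        s_new = np.sign(k @ alpha)                 # kernelized local field
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):               # fixed point reached
            break
        s = s_new
    return s

# Corrupt a stored pattern with 20% bit flips and test recovery.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
probe[flip] *= -1
recovered = recall(probe)
print("overlap with original:", recovered @ patterns[0] / N)
```

Note that the state update depends on the stored patterns only through kernel evaluations, so no explicit N x N weight matrix is ever formed; the P x N matrix of dual variables takes its place.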
Similar Papers
Kernel Ridge Regression for Efficient Learning of High-Capacity Hopfield Networks
Machine Learning (CS)
Stores more patterns with more efficient training.
Quantitative Attractor Analysis of High-Capacity Kernel Logistic Regression Hopfield Networks
Machine Learning (CS)
Stores many more patterns while keeping recall error-free.
Self-Organization of Attractor Landscapes in High-Capacity Kernel Logistic Regression Hopfield Networks
Machine Learning (CS)
Substantially increases how much an associative memory can store.