Spectral Concentration at the Edge of Stability: Information Geometry of Kernel Associative Memory
By: Akira Tamamori
High-capacity kernel Hopfield networks exhibit a "Ridge of Optimization" characterized by extreme stability. While this regime has previously been linked to "Spectral Concentration," its origin has remained elusive. Here, we analyze the network dynamics on a statistical manifold, revealing that the Ridge corresponds to the "Edge of Stability," a critical boundary where the Fisher Information Matrix becomes singular. We demonstrate that the apparent Euclidean force antagonism is a manifestation of "Dual Equilibrium" in the Riemannian space. This unifies learning dynamics and capacity via the Minimum Description Length principle, offering a geometric theory of self-organized criticality.
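The abstract's criterion for the "Edge of Stability" is that the Fisher Information Matrix (FIM) becomes singular, i.e., its smallest eigenvalue approaches zero. As a minimal sketch of this diagnostic (not the paper's kernel Hopfield model), the toy example below uses the empirical FIM of a logistic model, F = (1/N) Σᵢ pᵢ(1−pᵢ) xᵢxᵢᵀ, and tracks its minimum eigenvalue as the parameters are scaled toward saturation; the model, data, and scaling schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))  # toy inputs, 200 samples, 3 features

def empirical_fim(w):
    """Empirical Fisher Information Matrix of a logistic model.

    For p(y=1|x) = sigmoid(w.x), the FIM is
    F = (1/N) * sum_i p_i (1 - p_i) x_i x_i^T.
    (Illustrative stand-in for the paper's kernel network.)
    """
    p = 1.0 / (1.0 + np.exp(-X @ w))
    fisher_weights = p * (1.0 - p)          # per-sample curvature weight
    return (X * fisher_weights[:, None]).T @ X / len(X)

# As the parameter norm grows, the sigmoids saturate, the curvature
# weights vanish, and the FIM's smallest eigenvalue collapses toward
# zero -- the singularity used here as the "Edge of Stability" marker.
for scale in [0.1, 1.0, 10.0, 100.0]:
    w = scale * np.ones(3)
    lam_min = np.linalg.eigvalsh(empirical_fim(w))[0]
    print(f"scale={scale:6.1f}  min FIM eigenvalue = {lam_min:.3e}")
```

The monotone collapse of the minimum eigenvalue gives a concrete, computable proxy for approaching a singular statistical manifold, which is the geometric condition the abstract associates with the Ridge of Optimization.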