Associative Memory and Generative Diffusion in the Zero-noise Limit
By: Joshua Hess, Quaid Morris
Potential Business Impact:
Makes AI recall stored information more reliably, even when the input is noisy or partly wrong.
Connections between generative diffusion and continuous-state associative memory models are studied. Morse-Smale dynamical systems are emphasized as universal approximators of gradient-based associative memory models, with diffusion models treated as white-noise perturbations of these systems. Universal properties of associative memory that follow from this description are identified and used to characterize a generic transition from generation to memory as noise levels diminish. The structural stability of Morse-Smale flows is shown to imply a notion of stability for diffusions at vanishing noise levels. Applied to one- and two-parameter families of gradients, this indicates stability at all but isolated points of associative memory learning landscapes, and of the learning and generation landscapes of diffusion models with gradient drift in the zero-noise limit; at those isolated points, small sets of generic bifurcations characterize the qualitative transitions between stable systems. Examples illustrate how these landscapes are characterized by sequences of such bifurcations, and structural stability criteria are given for classic and modern Hopfield networks (equivalently, the attention mechanism).
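To make the abstract's central picture concrete, here is a minimal sketch of a diffusion as a white-noise perturbation of a gradient-based associative memory flow. It is not the paper's construction, only an illustration under standard assumptions: the energy is the modern Hopfield energy E(x) = -(1/beta) * logsumexp(beta * X @ x) + 0.5 * ||x||^2, whose gradient-descent fixed points implement softmax attention retrieval, and the diffusion is Euler-Maruyama simulation of dx = -grad E(x) dt + sigma dW. The pattern matrix, beta, step sizes, and noise levels are all illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

# Stored patterns (rows): the memories of a modern Hopfield network.
X = rng.standard_normal((4, 8))
beta = 4.0  # inverse temperature; larger beta sharpens retrieval

def grad_energy(x):
    """Gradient of the modern Hopfield energy
    E(x) = -(1/beta) * logsumexp(beta * X @ x) + 0.5 * ||x||^2.
    Since -grad E(x) = X.T @ softmax(beta * X @ x) - x, the fixed points
    of the gradient flow are exactly attention/softmax retrieval states.
    """
    logits = beta * (X @ x)
    p = np.exp(logits - logits.max())  # numerically stable softmax
    p /= p.sum()
    return x - X.T @ p

def langevin(x0, sigma, steps=2000, dt=1e-2):
    """Euler-Maruyama for dx = -grad E(x) dt + sigma dW:
    a white-noise perturbation of the associative memory gradient flow."""
    x = x0.copy()
    for _ in range(steps):
        x += -grad_energy(x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Query: a corrupted version of stored pattern 0.
query = X[0] + 0.5 * rng.standard_normal(8)

for sigma in (1.0, 0.3, 0.0):  # shrinking noise: generation -> memory
    x = langevin(query, sigma)
    nearest = int(np.argmax(X @ x))
    print(f"sigma={sigma:.1f}  nearest stored pattern: {nearest}, "
          f"distance: {np.linalg.norm(x - X[nearest]):.3f}")

At sigma = 0 the dynamics reduce to the pure gradient (memory) flow and the state settles onto a stored pattern, while larger sigma lets trajectories wander between basins, a toy version of the generation-to-memory transition the abstract describes in the zero-noise limit.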
Similar Papers
Memorization to Generalization: Emergence of Diffusion Models from Associative Memory
Machine Learning (CS)
Teaches AI to remember and create new things.
Diffusion models under low-noise regime
CV and Pattern Recognition
Helps AI make better pictures by learning from less data.
Does Generation Require Memorization? Creative Diffusion Models using Ambient Diffusion
Machine Learning (CS)
Makes AI art generators less forgetful, more creative.