Dense associative memory on the Bures-Wasserstein space
By: Chandan Tankala, Krishnakumar Balasubramanian
Potential Business Impact:
Stores and retrieves information as whole probability distributions, not just simple lists of numbers.
Dense associative memories (DAMs) store and retrieve patterns as fixed points of an energy functional, but existing models are limited to vector representations. We extend DAMs to probability distributions equipped with the 2-Wasserstein distance, focusing mainly on the Bures-Wasserstein class of Gaussian densities. Our framework defines a log-sum-exp energy over the stored distributions together with retrieval dynamics that aggregate optimal transport maps with Gibbs weights. Stationary points of these dynamics correspond to self-consistent Wasserstein barycenters, generalizing classical DAM fixed points. We prove exponential storage capacity, provide quantitative retrieval guarantees under Wasserstein perturbations, and validate the model on synthetic and real-world distributional tasks. This work lifts associative memory from vectors to full distributions, bridging classical DAMs with modern generative modeling and enabling distributional storage and retrieval in memory-augmented learning.
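To make the retrieval mechanism concrete, here is a minimal numerical sketch for the Bures-Wasserstein (Gaussian) case. The closed-form 2-Wasserstein distance between Gaussians and the linear optimal transport map between them are standard facts; the specific update rule below (push the query forward along the Gibbs-weighted average of the transport maps to the stored patterns, whose fixed points are the self-consistent barycenters mentioned above) is an assumption read off the abstract, not the authors' code, and the names `retrieval_step` and `beta` are hypothetical.

```python
# Sketch of Gibbs-weighted retrieval on the Bures-Wasserstein space.
# Assumed energy: E(rho) = -(1/beta) * log(sum_i exp(-beta * W2^2(rho, mu_i))).
import numpy as np
from scipy.linalg import sqrtm

def bw_dist2(m1, S1, m2, S2):
    """Squared 2-Wasserstein distance between Gaussians (closed form)."""
    r1 = np.real(sqrtm(S1))
    cross = np.real(sqrtm(r1 @ S2 @ r1))
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))

def ot_map_matrix(S_q, S_i):
    """Linear part of the optimal transport map from N(., S_q) to N(., S_i)."""
    r = np.real(sqrtm(S_q))
    r_inv = np.linalg.inv(r)
    return r_inv @ np.real(sqrtm(r @ S_i @ r)) @ r_inv

def retrieval_step(m_q, S_q, means, covs, beta=5.0):
    """One assumed retrieval step: Gibbs-weight the stored Gaussians by
    -beta * W2^2 and push the query through the averaged OT map."""
    e = np.array([bw_dist2(m_q, S_q, m, S) for m, S in zip(means, covs)])
    w = np.exp(-beta * (e - e.min()))
    w /= w.sum()                                    # softmax over -beta * W2^2
    A_bar = sum(wi * ot_map_matrix(S_q, S) for wi, S in zip(w, covs))
    m_new = sum(wi * m for wi, m in zip(w, means))  # pushforward mean
    S_new = A_bar @ S_q @ A_bar.T                   # pushforward covariance
    return m_new, S_new

# Toy usage: a noisy query near one of three stored 2-D Gaussians
# should converge to (a barycenter dominated by) that pattern.
rng = np.random.default_rng(0)
means = [rng.normal(size=2) * 4.0 for _ in range(3)]
covs = [s * np.eye(2) for s in (0.5, 1.0, 2.0)]
m, S = means[0] + 0.3 * rng.normal(size=2), np.eye(2)
for _ in range(20):
    m, S = retrieval_step(m, S, means, covs)
```

At large inverse temperature beta the Gibbs weights concentrate on the nearest stored pattern, giving sharp single-pattern retrieval; at smaller beta the fixed point is a genuine Wasserstein barycenter of several patterns, matching the abstract's characterization of stationary points.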
Similar Papers
Saddle Hierarchy in Dense Associative Memory
Machine Learning (CS)
Makes AI learn faster and use less power.
Dense Associative Memory with Epanechnikov Energy
Machine Learning (CS)
Stores more memories, finds new ideas.
Distributed Dynamic Associative Memory via Online Convex Optimization
Machine Learning (CS)
Helps many computers learn together faster.