Adaptive Hopfield Network: Rethinking Similarities in Associative Memory
By: Shurong Wang, Yuqi Pan, Zhuoyang Shen, and more
Potential Business Impact:
Helps computers remember things more accurately.
Associative memory models are content-addressable memory systems fundamental to biological intelligence and are notable for their high interpretability. However, existing models evaluate retrieval quality by proximity, which cannot guarantee that the retrieved pattern has the strongest association with the query and therefore fails to ensure correctness. We reframe this problem by proposing that a query is a generative variant of a stored memory pattern, and define a variant distribution to model this subtle, context-dependent generative process. Consequently, correct retrieval should return the memory pattern with the maximum a posteriori probability of being the query's origin. This perspective reveals that an ideal similarity measure should approximate the likelihood of each stored pattern generating the query under the variant distribution, which fixed, pre-defined similarities used by existing associative memories cannot do. To this end, we develop adaptive similarity, a novel mechanism that learns to approximate this insightful but unknown likelihood from samples drawn from the context, aiming for correct retrieval. We theoretically prove that the proposed adaptive similarity achieves optimal correct retrieval under three canonical and widely applicable types of variants: noisy, masked, and biased. We integrate this mechanism into a novel adaptive Hopfield network (A-Hop), and empirical results show that it achieves state-of-the-art performance across diverse tasks, including memory retrieval, tabular classification, image classification, and multiple instance learning.
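To make the retrieval idea concrete, below is a minimal NumPy sketch, not the authors' implementation, contrasting a conventional fixed dot-product similarity with an "adaptive" similarity that scores each stored pattern by the likelihood of having generated the query under a variant model. The Gaussian noise model, the estimated scale `sigma_hat`, and the helper names (`fixed_similarity`, `adaptive_similarity`, `retrieve`) are illustrative assumptions; in the paper the likelihood is learned from context samples rather than given in closed form.

```python
# Illustrative sketch (assumed setup, not the paper's code): retrieval as
# maximum a posteriori selection over stored patterns, with similarity
# playing the role of a (log-)likelihood of generating the query.
import numpy as np

rng = np.random.default_rng(0)

# Stored memory patterns: M patterns of dimension d.
M, d = 5, 16
patterns = rng.standard_normal((M, d))

# A query is modeled as a generative variant of one stored pattern;
# here a noisy variant, one of the three canonical types in the paper.
true_idx = 2
sigma = 0.8
query = patterns[true_idx] + sigma * rng.standard_normal(d)

def fixed_similarity(query, patterns, beta=1.0):
    """Conventional Hopfield-style similarity: a scaled dot product."""
    return beta * patterns @ query

def adaptive_similarity(query, patterns, sigma_hat):
    """Hypothetical adaptive similarity: log-likelihood of each pattern
    generating the query under an assumed Gaussian variant model with
    estimated scale sigma_hat (the paper instead learns this from
    context samples)."""
    sq_dist = np.sum((patterns - query) ** 2, axis=1)
    return -sq_dist / (2.0 * sigma_hat ** 2)

def retrieve(scores, patterns):
    """Softmax-weighted readout over stored patterns, plus the index of
    the maximum-scoring (MAP) pattern implied by the scores."""
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ patterns, int(np.argmax(scores))

_, idx_fixed = retrieve(fixed_similarity(query, patterns), patterns)
_, idx_adapt = retrieve(adaptive_similarity(query, patterns, sigma_hat=sigma), patterns)
print("fixed similarity retrieved:", idx_fixed, "| adaptive similarity retrieved:", idx_adapt)
```

The design point the sketch tries to convey is that only the scoring function changes: the readout stays a standard softmax-weighted update, but the scores approximate a generative likelihood instead of raw proximity, so the MAP pattern is favored even when a nearer-by-distance pattern exists.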
Similar Papers
Collaborative Filtering using Variational Quantum Hopfield Associative Memory
Information Retrieval
Helps movies know what you'll like better.
Dynamic Homophily with Imperfect Recall: Modeling Resilience in Adversarial Networks
Social and Information Networks
Makes computer networks stronger by forgetting things.
Hopfield Networks Meet Big Data: A Brain-Inspired Deep Learning Framework for Semantic Data Linking
Machine Learning (CS)
Connects different data so computers understand it.