SYNAPSE: Empowering LLM Agents with Episodic-Semantic Memory via Spreading Activation

Published: January 6, 2026 | arXiv ID: 2601.02744v1

By: Hanqi Jiang, Junhao Chen, Yi Pan, and others

Potential Business Impact:

Helps AI agents retain long-term information and connect related memories more effectively.

Business Areas:
Semantic Search, Internet Services

While Large Language Models (LLMs) excel at generalized reasoning, standard retrieval-augmented approaches fail to address the disconnected nature of long-term agentic memory. To bridge this gap, we introduce Synapse (Synergistic Associative Processing Semantic Encoding), a unified memory architecture that transcends static vector similarity. Drawing from cognitive science, Synapse models memory as a dynamic graph where relevance emerges from spreading activation rather than pre-computed links. By integrating lateral inhibition and temporal decay, the system dynamically highlights relevant sub-graphs while filtering interference. We implement a Triple Hybrid Retrieval strategy that fuses geometric embeddings with activation-based graph traversal. Comprehensive evaluations on the LoCoMo benchmark show that Synapse significantly outperforms state-of-the-art methods in complex temporal and multi-hop reasoning tasks, offering a robust solution to the "Contextual Tunneling" problem. Our code and data will be made publicly available upon acceptance.
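The core mechanism described in the abstract, spreading activation with decay and lateral inhibition over a memory graph, can be sketched in a few lines. This is a minimal illustrative sketch only: the graph structure, parameter names, and thresholds below are assumptions for exposition, not the paper's actual Synapse implementation (which is not yet public).

```python
# Illustrative sketch of spreading activation over a memory graph.
# DECAY and INHIBITION are hypothetical parameters, not values from the paper.

DECAY = 0.6        # activation attenuates with each hop from the seed nodes
INHIBITION = 0.3   # lateral inhibition: prune nodes far below peak activation

def spread_activation(graph, seeds, hops=2):
    """graph: {node: [(neighbor, edge_weight), ...]}
    seeds: {node: initial_activation} from an initial (e.g. embedding) retrieval."""
    activation = dict(seeds)
    frontier = dict(seeds)
    for _ in range(hops):
        next_frontier = {}
        # Propagate activation along weighted edges, attenuated by DECAY.
        for node, act in frontier.items():
            for nbr, weight in graph.get(node, []):
                delta = act * weight * DECAY
                next_frontier[nbr] = next_frontier.get(nbr, 0.0) + delta
        for nbr, delta in next_frontier.items():
            activation[nbr] = activation.get(nbr, 0.0) + delta
        frontier = next_frontier
    # Lateral inhibition: suppress weakly activated nodes so only the
    # most relevant sub-graph survives as retrieval candidates.
    peak = max(activation.values())
    return {n: a for n, a in activation.items() if a >= INHIBITION * peak}

# Toy memory graph: a seed memory activates its neighbors transitively.
graph = {
    "trip":  [("paris", 0.9), ("budget", 0.5)],
    "paris": [("louvre", 0.8)],
}
result = spread_activation(graph, {"trip": 1.0})
```

In this toy run, "paris" stays strongly activated via the seed, while distant, weakly connected nodes like "louvre" fall below the inhibition threshold and are filtered out, mirroring how the paper describes highlighting relevant sub-graphs while suppressing interference.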

Page Count
17 pages

Category
Computer Science:
Computation and Language