Structured Episodic Event Memory
By: Zhengxuan Lu, Dongfang Li, Yukun Shi, and more
Potential Business Impact:
Helps AI remember stories to think better.
Current approaches to memory in Large Language Models (LLMs) predominantly rely on static Retrieval-Augmented Generation (RAG), which often results in scattered retrieval and fails to capture the structural dependencies required for complex reasoning. For autonomous agents, these passive and flat architectures lack the cognitive organization necessary to model the dynamic and associative nature of long-term interaction. To address this, we propose Structured Episodic Event Memory (SEEM), a hierarchical framework that synergizes a graph memory layer for relational facts with a dynamic episodic memory layer for narrative progression. Grounded in cognitive frame theory, SEEM transforms interaction streams into structured Episodic Event Frames (EEFs) anchored by precise provenance pointers. Furthermore, we introduce an agentic associative fusion and Reverse Provenance Expansion (RPE) mechanism to reconstruct coherent narrative contexts from fragmented evidence. Experimental results on the LoCoMo and LongMemEval benchmarks demonstrate that SEEM significantly outperforms baselines, enabling agents to maintain superior narrative coherence and logical consistency.
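To make the abstract's two central ideas concrete, here is a minimal illustrative sketch of an Episodic Event Frame carrying provenance pointers, and of Reverse Provenance Expansion reconstructing context from them. All names, fields, and the window-based expansion logic are assumptions for illustration, not the paper's actual schema or algorithm.

```python
from dataclasses import dataclass

@dataclass
class EpisodicEventFrame:
    """Hypothetical EEF: a structured summary of an episode, anchored to
    the raw interaction stream by provenance pointers (turn indices)."""
    event_id: str
    summary: str
    provenance: list  # indices of the raw turns this frame was built from

def reverse_provenance_expansion(frames, dialogue, window=1):
    """Illustrative RPE: follow each retrieved frame's provenance pointers
    back into the raw dialogue and include a small window of neighboring
    turns, yielding a coherent narrative context from fragmented evidence."""
    turn_ids = set()
    for frame in frames:
        for t in frame.provenance:
            for k in range(t - window, t + window + 1):
                if 0 <= k < len(dialogue):
                    turn_ids.add(k)
    return [dialogue[k] for k in sorted(turn_ids)]

# Usage: a frame anchored at turn 2 expands to turns 1-3 of the raw stream.
dialogue = [f"turn {i}" for i in range(6)]
frame = EpisodicEventFrame("e1", "user booked a trip", provenance=[2])
context = reverse_provenance_expansion([frame], dialogue)
```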
Similar Papers
Memory Matters More: Event-Centric Memory as a Logic Map for Agent Searching and Reasoning
Artificial Intelligence
Helps AI remember and use past events better.
Episodic Memories Generation and Evaluation Benchmark for Large Language Models
Computation and Language
Helps computers remember past events like people.
MemRL: Self-Evolving Agents via Runtime Reinforcement Learning on Episodic Memory
Computation and Language
Teaches computers to learn new things like people.