SEDM: Scalable Self-Evolving Distributed Memory for Agents
By: Haoran Xu, Jiacong Hu, Ke Zhang, and more
Potential Business Impact:
Makes AI remember better, learn faster, and share knowledge.
Long-term multi-agent systems inevitably generate vast amounts of trajectories and historical interactions, which makes efficient memory management essential for both performance and scalability. Existing methods typically depend on vector retrieval and hierarchical storage, yet they are prone to noise accumulation, uncontrolled memory expansion, and limited generalization across domains. To address these challenges, we present SEDM, Self-Evolving Distributed Memory, a verifiable and adaptive framework that transforms memory from a passive repository into an active, self-optimizing component. SEDM integrates verifiable write admission based on reproducible replay, a self-scheduling memory controller that dynamically ranks and consolidates entries according to empirical utility, and cross-domain knowledge diffusion that abstracts reusable insights to support transfer across heterogeneous tasks. Evaluations on benchmark datasets demonstrate that SEDM improves reasoning accuracy while reducing token overhead compared with strong memory baselines, and further enables knowledge distilled from fact verification to enhance multi-hop reasoning. These results highlight SEDM as a scalable and sustainable memory mechanism for open-ended multi-agent collaboration. The code will be released at a later stage of this project.
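To make the first two components concrete, the following is a minimal sketch of what verifiable write admission and utility-based scheduling could look like. All class and method names here are illustrative assumptions, not the paper's actual implementation: entries are admitted only if a caller-supplied replay check reproduces their claimed result, and the store ranks entries by empirical utility, dropping the lowest-utility ones when capacity is exceeded.

```python
from dataclasses import dataclass


@dataclass
class MemoryEntry:
    # Hypothetical schema: a stored insight plus its empirical usage statistics.
    content: str
    uses: int = 0
    successes: int = 0

    def utility(self) -> float:
        # Empirical utility: smoothed success rate (Laplace prior), so a
        # never-used entry starts at 0.5 rather than 0 or undefined.
        return (self.successes + 1) / (self.uses + 2)


class SelfSchedulingMemory:
    """Toy sketch of SEDM-style memory management (names are assumptions):
    - write admission: store an entry only if a reproducible replay check
      passes, keeping unverifiable noise out of memory;
    - self-scheduling: rank entries by empirical utility and consolidate
      by evicting the lowest-utility entries past a capacity bound."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries: list[MemoryEntry] = []

    def admit(self, content: str, replay_check) -> bool:
        # Verifiable write admission: reject entries whose replay fails.
        if not replay_check(content):
            return False
        self.entries.append(MemoryEntry(content))
        self._consolidate()
        return True

    def record_outcome(self, content: str, success: bool) -> None:
        # Feed back task outcomes so utility estimates stay empirical.
        for e in self.entries:
            if e.content == content:
                e.uses += 1
                e.successes += int(success)

    def _consolidate(self) -> None:
        # Keep only the top-`capacity` entries by utility.
        self.entries.sort(key=MemoryEntry.utility, reverse=True)
        del self.entries[self.capacity:]

    def top(self, k: int) -> list[str]:
        # Retrieval surface: highest-utility entries first.
        self.entries.sort(key=MemoryEntry.utility, reverse=True)
        return [e.content for e in self.entries[:k]]
```

In this sketch the replay check stands in for the paper's reproducible-replay admission gate, and the smoothed success rate stands in for its empirical-utility ranking; a real system would replay trajectories and score utility against downstream task performance.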
Similar Papers
Distributed Dynamic Associative Memory via Online Convex Optimization
Machine Learning (CS)
Helps many computers learn together faster.
Memoria: A Scalable Agentic Memory Framework for Personalized Conversational AI
Artificial Intelligence
Helps AI remember you and talk better.