DyCP: Dynamic Context Pruning for Long-Form Dialogue with LLMs

Published: January 12, 2026 | arXiv ID: 2601.07994v1

By: Nayoung Choi, Jonathan Zhang, Jinho D. Choi

Potential Business Impact:

Makes chatbots remember more and answer faster.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Large Language Models (LLMs) often exhibit increased response latency and degraded answer quality as dialogue length grows, making effective context management essential. However, existing methods rely on extra LLM calls to build memory or perform offline memory construction without considering the current user utterance, which can introduce inefficiencies or disrupt conversational continuity. We introduce DyCP, a lightweight context management method that dynamically segments and retrieves relevant memory at query time. It preserves the sequential structure of dialogue without requiring predefined topic boundaries and supports efficient, adaptive context retrieval. Across three long-form dialogue benchmarks (LoCoMo, MT-Bench+, and SCM4LLMs) and multiple LLMs, DyCP consistently improves answer quality while reducing response latency. We also examine the gap between modern LLMs' expanded context windows and their actual long-context processing capacity, highlighting the continued importance of effective context management.
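To make the general idea concrete, here is a minimal sketch of query-time context pruning: the dialogue history is split into segments, each segment is scored against the current user query, and only the most relevant segments are kept, in their original order. This is an illustrative toy, not the paper's method: the fixed segment size, the token-overlap scoring, and all function names are assumptions (DyCP segments dynamically and a real system would use embedding similarity).

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase word tokens; a crude stand-in for a real tokenizer."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def segment_dialogue(turns, seg_size=2):
    """Group consecutive turns into fixed-size segments, preserving order.
    (Fixed-size windows are illustrative only; DyCP derives segments
    dynamically at query time rather than from a preset size.)"""
    return [turns[i:i + seg_size] for i in range(0, len(turns), seg_size)]

def overlap_score(query, segment):
    """Toy relevance score: token overlap between query and segment text.
    An embedding-based similarity would replace this in practice."""
    q, s = tokenize(query), tokenize(" ".join(segment))
    return sum((q & s).values())

def prune_context(turns, query, seg_size=2, top_k=2):
    """Keep the top_k most query-relevant segments, restore their original
    dialogue order, and flatten them back into a pruned context."""
    segments = segment_dialogue(turns, seg_size)
    ranked = sorted(range(len(segments)),
                    key=lambda i: overlap_score(query, segments[i]),
                    reverse=True)[:top_k]
    return [turn for i in sorted(ranked) for turn in segments[i]]

dialogue = [
    "User: I adopted a beagle named Max last spring.",
    "Bot: Beagles are energetic; Max must keep you busy!",
    "User: Separately, my flight to Tokyo is next month.",
    "Bot: Exciting! Let me know if you need packing tips.",
    "User: Max has been chewing shoes lately.",
    "Bot: Puppy-proofing the closet might help.",
]
context = prune_context(dialogue, "Tell me about Max the beagle", top_k=2)
```

Because the kept segment indices are re-sorted before flattening, the pruned context preserves the sequential structure of the conversation, which the abstract highlights as a property of DyCP's retrieval.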

Country of Origin
πŸ‡ΊπŸ‡Έ United States


Page Count
14 pages

Category
Computer Science:
Computation and Language