Learning Contextual Retrieval for Robust Conversational Search
By: Seunghan Yang, Juntae Lee, Jihwan Bang, and more
Potential Business Impact:
Helps search engines remember what you asked before.
Effective conversational search demands a deep understanding of user intent across multiple dialogue turns. Users frequently use abbreviations and shift topics in the middle of conversations, posing challenges for conventional retrievers. While query rewriting techniques improve clarity, they often incur significant computational cost due to additional autoregressive steps. Moreover, although LLM-based retrievers demonstrate strong performance, they are not explicitly optimized to track user intent in multi-turn settings, often failing under topic drift or contextual ambiguity. To address these limitations, we propose ContextualRetriever, a novel LLM-based retriever that directly incorporates conversational context into the retrieval process. Our approach introduces: (1) a context-aware embedding mechanism that highlights the current query within the dialogue history; (2) intent-guided supervision based on high-quality rewritten queries; and (3) a training strategy that preserves the generative capabilities of the base LLM. Extensive evaluations across multiple conversational search benchmarks demonstrate that ContextualRetriever significantly outperforms existing methods while incurring no additional inference overhead.
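The abstract names three mechanisms but not their implementation. The sketch below is a rough, hypothetical illustration (not the authors' code) of the first two ideas: embedding the current query together with its dialogue history while explicitly marking the current turn, and supervising that contextual embedding with the embedding of a self-contained rewritten query. The `[TURN]`/`[CURRENT QUERY]` markers, the helper names, and the tiny bag-of-words encoder (a stand-in for the LLM-based encoder) are all assumptions for illustration.

```python
# Minimal sketch of context-aware embedding + intent-guided supervision.
# The toy encoder below stands in for the paper's LLM-based retriever.

import torch
import torch.nn.functional as F


def format_dialogue(history: list[str], query: str) -> str:
    """Flatten prior turns and highlight the current query with markers."""
    turns = " ".join(f"[TURN] {t}" for t in history)
    return f"{turns} [CURRENT QUERY] {query}"


class ToyEncoder(torch.nn.Module):
    """Stand-in encoder: hashed bag-of-words -> mean-pooled unit embedding."""

    def __init__(self, vocab_size: int = 10_000, dim: int = 64):
        super().__init__()
        self.emb = torch.nn.Embedding(vocab_size, dim)
        self.vocab_size = vocab_size

    def forward(self, text: str) -> torch.Tensor:
        ids = torch.tensor(
            [hash(tok) % self.vocab_size for tok in text.lower().split()]
        )
        return F.normalize(self.emb(ids).mean(dim=0), dim=-1)


encoder = ToyEncoder()

history = ["Who directed Alien?", "When was it released?"]
query = "What about the sequel?"  # ambiguous without the dialogue context
rewritten = "When was the sequel to Alien released?"  # assumed gold rewrite

# (1) Context-aware embedding: the current query is encoded inside its
# dialogue history, with the current turn explicitly highlighted.
ctx_emb = encoder(format_dialogue(history, query))

# (2) Intent-guided supervision: pull the contextual embedding toward the
# embedding of the high-quality rewritten query (cosine-distance loss here;
# the paper's actual training objective may differ).
with torch.no_grad():
    target_emb = encoder(rewritten)
loss = 1.0 - F.cosine_similarity(ctx_emb, target_emb, dim=0)
loss.backward()  # gradients flow only into the contextual encoder
print(f"alignment loss: {loss.item():.4f}")
```

The appeal of this setup, as the abstract frames it, is that the rewritten query is needed only as a training target: at inference time the retriever embeds the raw dialogue directly, so no autoregressive rewriting step is paid per query.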
Similar Papers
Retrieval Augmented Generation based context discovery for ASR
Computation and Language
Makes voice recorders understand tricky words better.
Benchmarking Contextual Understanding for In-Car Conversational Systems
Computation and Language
Tests car voice assistants for better answers.
Efficient Conversational Search via Topical Locality in Dense Retrieval
Information Retrieval
Makes online searches faster by guessing what you'll ask next.