Improving Dense Passage Retrieval with Multiple Positive Passages
By: Shuai Chang
Potential Business Impact:
Finds better answers by using more examples.
By leveraging a dual-encoder architecture, Dense Passage Retrieval (DPR) has outperformed traditional sparse retrieval algorithms such as BM25 in passage retrieval accuracy. Recently proposed methods have further improved DPR's performance. However, these models typically pair each question with only one positive passage during training, and the effect of associating multiple positive passages with each question has not been examined. In this paper, we explore the performance of DPR when additional positive passages are incorporated during training. Experimental results show that equipping each question with multiple positive passages consistently improves retrieval accuracy, even with a significantly smaller batch size, which enables training on a single GPU.
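The abstract describes training DPR's dual encoder with several positive passages per question instead of one. Below is a minimal PyTorch sketch of one way such a multi-positive contrastive objective could look, with in-batch passages reused as negatives and the loss averaged over each question's positives. The function name, tensor shapes, and toy data are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def multi_positive_dpr_loss(q_emb, p_emb, positive_mask):
    """
    Contrastive loss for dense retrieval with multiple positives per question.
    (Hypothetical sketch, not the paper's code.)

    q_emb:          (B, d) question embeddings from the question encoder
    p_emb:          (P, d) all passage embeddings in the batch; passages that are
                    not positives for a given question act as in-batch negatives
    positive_mask:  (B, P) boolean, True where passage j is a positive for question i
    """
    # Similarity of every question against every passage in the batch.
    scores = q_emb @ p_emb.T                       # (B, P)
    log_probs = F.log_softmax(scores, dim=1)       # softmax over all batch passages

    # Negative log-likelihood of each positive, averaged per question.
    mask = positive_mask.float()
    per_pair_nll = -log_probs * mask               # zero out non-positive pairs
    num_pos = mask.sum(dim=1).clamp(min=1.0)
    loss_per_question = per_pair_nll.sum(dim=1) / num_pos
    return loss_per_question.mean()


if __name__ == "__main__":
    # Toy example: 2 questions, 5 passages in the batch, 2-3 positives each.
    torch.manual_seed(0)
    q = F.normalize(torch.randn(2, 8), dim=1)
    p = F.normalize(torch.randn(5, 8), dim=1)
    mask = torch.tensor([[1, 1, 0, 0, 0],
                         [0, 0, 1, 1, 1]], dtype=torch.bool)
    print(multi_positive_dpr_loss(q, p, mask).item())
```

With a single positive per question this reduces to the standard DPR in-batch negative log-likelihood; averaging over several positives is one natural way to extend it, which is why a smaller batch can still expose each question to enough contrastive signal.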
Similar Papers
Dense Passage Retrieval in Conversational Search
Information Retrieval
Finds better answers in conversations.
MA-DPR: Manifold-aware Distance Metrics for Dense Passage Retrieval
Information Retrieval
Finds answers even when words don't match.
From Ranking to Selection: A Simple but Efficient Dynamic Passage Selector for Retrieval Augmented Generation
Computation and Language
Helps AI find better answers from many texts.