SciRerankBench: Benchmarking Rerankers Towards Scientific Retrieval-Augmented Generated LLMs
By: Haotian Chen, Qingqing Long, Meng Xiao, and more
Potential Business Impact:
Helps computers find correct science answers faster.
Scientific literature question answering is a pivotal step towards new scientific discoveries. Recently, two-stage retrieval-augmented generated large language models (RAG-LLMs) have shown impressive advancements in this domain. Such a two-stage framework, and especially its second stage (the reranker), is particularly essential in the scientific domain, where subtle differences in terminology can severely degrade the final factual-oriented or knowledge-intensive answers. Despite this significant progress, the potential and limitations of these approaches remain unexplored. In this work, we present the Scientific Rerank-oriented RAG Benchmark (SciRerankBench) for evaluating rerankers within RAG-LLM systems, spanning five scientific subjects. To rigorously assess reranker performance in terms of noise resilience, relevance disambiguation, and factual consistency, we develop three types of question-context-answer (Q-C-A) pairs: Noisy Contexts (NC), Semantically Similar but Logically Irrelevant Contexts (SSLI), and Counterfactual Contexts (CC). Through systematic evaluation of 13 widely used rerankers on five families of LLMs, we provide detailed insights into their relative strengths and limitations. To the best of our knowledge, SciRerankBench is the first benchmark specifically developed to evaluate rerankers within RAG-LLMs, offering valuable observations and guidance for their future development.
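The two-stage framework described in the abstract pairs a fast first-stage retriever with a reranker that rescores the retrieved contexts before they reach the generator LLM. Below is a minimal sketch of such a retrieve-then-rerank pipeline, assuming the sentence-transformers library; the model names and the toy scientific corpus are illustrative stand-ins, not the benchmark's actual configuration.

```python
# Minimal retrieve-then-rerank sketch (illustrative; not SciRerankBench's code).
# Stage 1: a bi-encoder retrieves candidate contexts by embedding similarity.
# Stage 2: a cross-encoder reranker rescores query-context pairs so that noisy
# or semantically-similar-but-irrelevant contexts are pushed down before the
# surviving contexts are handed to the generator LLM.
from sentence_transformers import SentenceTransformer, CrossEncoder, util

corpus = [
    "CRISPR-Cas9 introduces targeted double-strand breaks in genomic DNA.",
    "CRISPR interference (CRISPRi) represses transcription without cutting DNA.",
    "The Krebs cycle oxidizes acetyl-CoA to produce NADH and FADH2.",
]
query = "Which technique silences genes without cleaving the genome?"

# Stage 1: dense retrieval with a bi-encoder.
retriever = SentenceTransformer("all-MiniLM-L6-v2")
corpus_emb = retriever.encode(corpus, convert_to_tensor=True)
query_emb = retriever.encode(query, convert_to_tensor=True)
hits = util.semantic_search(query_emb, corpus_emb, top_k=3)[0]

# Stage 2: rerank the retrieved candidates with a cross-encoder.
reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
candidates = [corpus[h["corpus_id"]] for h in hits]
scores = reranker.predict([(query, c) for c in candidates])

# Contexts ordered by reranker score would then be passed to the LLM prompt.
for score, context in sorted(zip(scores, candidates), reverse=True):
    print(f"{score:+.3f}  {context}")
```

Because the cross-encoder scores the query and each context jointly, it can separate contexts that are semantically similar but logically irrelevant, which is the failure mode the benchmark's SSLI pairs are designed to probe.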
Similar Papers
How Good are LLM-based Rerankers? An Empirical Analysis of State-of-the-Art Reranking Models
Computation and Language
Finds better search results for new questions.
A Reasoning-Focused Legal Retrieval Benchmark
Computation and Language
Helps lawyers find important legal information faster.
CoRank: LLM-Based Compact Reranking with Document Features for Scientific Retrieval
Information Retrieval
Finds better science papers faster.