Score: 3

ZeroGR: A Generalizable and Scalable Framework for Zero-Shot Generative Retrieval

Published: October 12, 2025 | arXiv ID: 2510.10419v1

By: Weiwei Sun, Keyi Kong, Xinyu Ma, and more

Potential Business Impact:

Lets search systems handle new kinds of queries and documents without task-specific training data.

Business Areas:
Semantic Search, Internet Services

Generative retrieval (GR) reformulates information retrieval (IR) by framing it as the generation of document identifiers (docids), thereby enabling end-to-end optimization and seamless integration with generative language models (LMs). Despite notable progress under supervised training, GR still struggles to generalize to zero-shot IR scenarios, which are prevalent in real-world applications. To tackle this challenge, we propose ZeroGR, a zero-shot generative retrieval framework that leverages natural language instructions to extend GR across a wide range of IR tasks. Specifically, ZeroGR is composed of three key components: (i) an LM-based docid generator that unifies heterogeneous documents (e.g., text, tables, code) into semantically meaningful docids; (ii) an instruction-tuned query generator that produces diverse types of queries from natural language task descriptions to enhance corpus indexing; and (iii) a reverse annealing decoding strategy that balances precision and recall during docid generation. We investigate the impact of instruction fine-tuning scale and find that performance consistently improves as the number of IR tasks encountered during training increases. Empirical results on the BEIR and MAIR benchmarks demonstrate that ZeroGR outperforms strong dense retrieval and generative baselines in zero-shot settings, establishing a new state-of-the-art for instruction-driven GR.
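The reverse annealing decoding strategy is only named in the abstract, so the following is a minimal sketch of one plausible reading, assuming "reverse annealing" means scheduling the sampling temperature upward across docid decoding steps: early docid tokens are chosen near-greedily (favoring precision), while later tokens are sampled from a flatter distribution (favoring recall). Everything here, including the `step_logits_fn` interface, the `t_start`/`t_end` endpoints, and the linear schedule, is a hypothetical stand-in rather than the paper's implementation.

```python
import math
import random

def softmax(logits, temperature):
    # Temperature-scaled softmax over raw scores.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    return [e / z for e in exps]

def reverse_anneal_temperature(step, num_steps, t_start=0.3, t_end=1.2):
    # Hypothetical schedule: start near-greedy (high precision) and warm
    # up toward a flatter distribution (higher recall) as the prefix grows.
    frac = step / max(1, num_steps - 1)
    return t_start + frac * (t_end - t_start)

def decode_docid(step_logits_fn, vocab, num_steps, seed=0):
    # step_logits_fn(prefix) -> raw logits over `vocab`; a stand-in for
    # the docid generator LM, whose interface the abstract does not expose.
    rng = random.Random(seed)
    prefix = []
    for step in range(num_steps):
        t = reverse_anneal_temperature(step, num_steps)
        probs = softmax(step_logits_fn(prefix), t)
        prefix.append(rng.choices(vocab, weights=probs, k=1)[0])
    return prefix

if __name__ == "__main__":
    vocab = ["doc", "tab", "code", "web"]              # toy docid token space
    fake_logits = lambda prefix: [2.0, 1.0, 0.5, 0.1]  # stand-in scorer
    print(decode_docid(fake_logits, vocab, num_steps=4))
```

The intuition behind this reading: committing confidently to the first docid tokens narrows the candidate document set precisely, while loosening later steps keeps more plausible documents reachable, trading a little precision for recall at the tail of the docid.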

Country of Origin
πŸ‡³πŸ‡± πŸ‡ΊπŸ‡Έ Netherlands, United States

Repos / Data Links

Page Count
18 pages

Category
Computer Science:
Information Retrieval