CIRAG: Construction-Integration Retrieval and Adaptive Generation for Multi-hop Question Answering
By: Zili Wei, Xiaocui Yang, Yilin Wang, and more
Triple-based Iterative Retrieval-Augmented Generation (iRAG) mitigates document-level noise in multi-hop question answering. However, existing methods still face two limitations: (i) greedy single-path expansion, which propagates early errors and fails to capture parallel evidence from different reasoning branches, and (ii) granularity-demand mismatch, where a single evidence representation struggles to balance noise control against contextual sufficiency. In this paper, we propose the Construction-Integration Retrieval and Adaptive Generation model, CIRAG. It introduces an Iterative Construction-Integration module that constructs candidate triples and integrates them conditioned on the retrieval history, distilling core triples and generating the next-hop query; by preserving multiple plausible evidence chains, this module mitigates the greedy trap. In addition, we propose an Adaptive Cascaded Multi-Granularity Generation module that progressively expands the contextual evidence, from triples to supporting sentences to full passages, according to the question's requirements. Moreover, we introduce Trajectory Distillation, which distills the teacher model's integration policy into a lightweight student, enabling efficient and reliable long-horizon reasoning. Extensive experiments demonstrate that CIRAG outperforms existing iRAG methods.
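The abstract describes the pipeline only at a high level. The Python sketch below illustrates one plausible reading of that control flow: an iterative construction-integration retrieval loop followed by cascaded multi-granularity generation. It is a minimal sketch, not the paper's implementation; every name in it (EvidenceHistory, retrieve, extract_triples, integrate, answer, is_sufficient, max_hops) is an illustrative placeholder for the retriever and LLM calls the abstract mentions.

    # Minimal sketch of a CIRAG-style pipeline; component callables are placeholders.
    from dataclasses import dataclass, field
    from typing import Callable, List, Optional, Tuple

    Triple = Tuple[str, str, str]  # (subject, relation, object)

    @dataclass
    class EvidenceHistory:
        triples: List[Triple] = field(default_factory=list)   # distilled core triples
        sentences: List[str] = field(default_factory=list)    # their supporting sentences
        passages: List[str] = field(default_factory=list)     # full retrieved passages

    def construction_integration(
        question: str,
        retrieve: Callable[[str], List[str]],                  # query -> passages
        extract_triples: Callable[[List[str]], List[Triple]],  # passages -> candidate triples
        integrate: Callable[[str, EvidenceHistory, List[Triple]],
                            Tuple[List[Triple], List[str], Optional[str]]],
        max_hops: int = 4,
    ) -> EvidenceHistory:
        # Iteratively construct candidate triples and integrate them conditioned on the
        # accumulated history, keeping several plausible evidence branches alive.
        history = EvidenceHistory()
        query: Optional[str] = question
        for _ in range(max_hops):
            passages = retrieve(query)
            candidates = extract_triples(passages)                           # construction
            core, support, query = integrate(question, history, candidates)  # integration
            history.triples.extend(core)
            history.sentences.extend(support)
            history.passages.extend(passages)
            if query is None:  # integration judged the evidence chain complete
                break
        return history

    def adaptive_generation(
        question: str,
        history: EvidenceHistory,
        answer: Callable[[str, List[str]], str],               # (question, context) -> answer
        is_sufficient: Callable[[str, List[str], str], bool],  # sufficiency check
    ) -> str:
        # Cascade from the sparsest to the richest evidence granularity, expanding
        # the context only when the current level is judged insufficient.
        triples_as_text = [f"{s} {r} {o}" for s, r, o in history.triples]
        levels = [
            triples_as_text,                                         # level 1: core triples
            triples_as_text + history.sentences,                     # level 2: + supporting sentences
            triples_as_text + history.sentences + history.passages,  # level 3: + full passages
        ]
        prediction = ""
        for context in levels:
            prediction = answer(question, context)
            if is_sufficient(question, context, prediction):
                break
        return prediction

Under these assumptions, a caller would run construction_integration first to build the evidence history and then pass it to adaptive_generation; the Trajectory Distillation step described in the abstract would correspond to training a smaller model to imitate the teacher's integrate decisions, which is outside the scope of this sketch.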
Similar Papers
KiRAG: Knowledge-Driven Iterative Retriever for Enhancing Retrieval-Augmented Generation
Computation and Language
Helps computers answer hard questions by finding facts.
HIRAG: Hierarchical-Thought Instruction-Tuning Retrieval-Augmented Generation
Computation and Language
Helps AI think better to answer questions.
Beyond Single Pass, Looping Through Time: KG-IRAG with Iterative Knowledge Retrieval
Artificial Intelligence
Helps computers solve complex problems step-by-step.