Score: 2

Beyond Chunks and Graphs: Retrieval-Augmented Generation through Triplet-Driven Thinking

Published: August 4, 2025 | arXiv ID: 2508.02435v1

By: Shengbo Gong, Xianfeng Tang, Carl Yang, and more

BigTech Affiliations: Amazon

Potential Business Impact:

Makes AI question answering more accurate while cutting retrieval and LLM call costs.

Retrieval-augmented generation (RAG) is critical for reducing hallucinations and incorporating external knowledge into Large Language Models (LLMs). However, advanced RAG systems face a trade-off between performance and efficiency. Multi-round RAG approaches achieve strong reasoning but incur excessive LLM calls and token costs, while Graph RAG methods suffer from computationally expensive, error-prone graph construction and retrieval redundancy. To address these challenges, we propose T²RAG, a novel framework that operates on a simple, graph-free knowledge base of atomic triplets. T²RAG leverages an LLM to decompose questions into searchable triplets with placeholders, which it then iteratively resolves by retrieving evidence from the triplet database. Empirical results show that T²RAG significantly outperforms state-of-the-art multi-round and Graph RAG methods, achieving an average performance gain of up to 11% across six datasets while reducing retrieval costs by up to 45%. Our code is available at https://github.com/rockcor/T2RAG
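To make the abstract's mechanism concrete, below is a minimal, runnable sketch of a triplet-driven resolution loop: a question is decomposed into triplets with placeholders, evidence is retrieved from a flat triplet store, and placeholders are substituted over iterative rounds. The LLM steps are replaced with toy string matching, and all names and data layouts here are illustrative assumptions, not the paper's actual implementation.

```python
# Toy sketch of triplet-driven retrieval with placeholder resolution.
# Assumed names and data layout; the real system uses an LLM for
# decomposition and resolution, and a retriever over a triplet database.

Triplet = tuple[str, str, str]  # (subject, relation, object)

# Graph-free knowledge base: a flat list of atomic triplets.
TRIPLET_DB: list[Triplet] = [
    ("Titanic", "won", "Best Picture 1998"),
    ("Titanic", "directed_by", "James Cameron"),
]

def has_placeholder(t: Triplet) -> bool:
    # Placeholders (e.g. "?film") mark entities the question does not name.
    return any(v.startswith("?") for v in t)

def search(query: Triplet, db: list[Triplet]) -> list[Triplet]:
    """Return stored triplets matching every non-placeholder slot of the query."""
    return [
        t for t in db
        if all(q.startswith("?") or q == v for q, v in zip(query, t))
    ]

def bind(query: Triplet, match: Triplet) -> dict[str, str]:
    """Map placeholder names in the query to values from a matching triplet."""
    return {q: v for q, v in zip(query, match) if q.startswith("?")}

def resolve(queries: list[Triplet], db: list[Triplet], max_rounds: int = 3) -> list[Triplet]:
    """Iteratively retrieve evidence and substitute placeholders until grounded."""
    for _ in range(max_rounds):
        unresolved = [q for q in queries if has_placeholder(q)]
        if not unresolved:
            break
        bindings: dict[str, str] = {}
        for q in unresolved:
            for m in search(q, db):
                bindings.update(bind(q, m))
        # Substitute any placeholders grounded in this round.
        queries = [tuple(bindings.get(v, v) for v in q) for q in queries]
    return queries

# Question: "Who directed the film that won Best Picture in 1998?"
# decomposed (by the LLM in the real system) into placeholder triplets:
question_triplets = [
    ("?film", "won", "Best Picture 1998"),
    ("?film", "directed_by", "?director"),
]
print(resolve(question_triplets, TRIPLET_DB))
# -> [('Titanic', 'won', 'Best Picture 1998'), ('Titanic', 'directed_by', 'James Cameron')]
```

In this sketch, resolution happens to converge in one round; the multi-round loop matters when a placeholder in one triplet can only be grounded after another triplet has been resolved.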

Country of Origin
🇺🇸 United States

Repos / Data Links
https://github.com/rockcor/T2RAG

Page Count
19 pages

Category
Computer Science:
Information Retrieval