CausalRAG: Integrating Causal Graphs into Retrieval-Augmented Generation

Published: March 25, 2025 | arXiv ID: 2503.19878v2

By: Nengbo Wang, Xiaotian Han, Jagdip Singh, and more

Potential Business Impact:

Helps AI systems retrieve and connect information based on cause-and-effect relationships, producing more accurate and interpretable answers.

Business Areas:
Semantic Search, Internet Services

Large language models (LLMs) have revolutionized natural language processing (NLP), particularly through Retrieval-Augmented Generation (RAG), which enhances LLM capabilities by integrating external knowledge. However, traditional RAG systems face critical limitations, including disrupted contextual integrity due to text chunking, and over-reliance on semantic similarity for retrieval. To address these issues, we propose CausalRAG, a novel framework that incorporates causal graphs into the retrieval process. By constructing and tracing causal relationships, CausalRAG preserves contextual continuity and improves retrieval precision, leading to more accurate and interpretable responses. We evaluate CausalRAG against regular RAG and graph-based RAG approaches, demonstrating its superiority across several metrics. Our findings suggest that grounding retrieval in causal reasoning provides a promising approach to knowledge-intensive tasks.
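To make the core idea concrete, below is a minimal sketch of causal-graph-guided retrieval. It is not the paper's implementation: the toy passages, the lexical scoring stand-in for semantic similarity, and the hop-based expansion are illustrative assumptions. The sketch shows how a retriever can start from a semantically matched seed passage and then follow causal edges so that the retrieved context stays causally connected rather than merely similar.

```python
# Hedged sketch of causal-graph-guided retrieval (assumptions, not the
# authors' method): causal edges between passages are assumed to be
# pre-extracted, and lexical overlap stands in for semantic similarity.
import networkx as nx

# Toy corpus: each passage is a node; a directed edge means "causes".
passages = {
    "p1": "Rising CO2 levels increase global average temperature.",
    "p2": "Higher temperatures accelerate polar ice melt.",
    "p3": "Ice melt raises sea levels and floods coastal cities.",
    "p4": "Solar panel efficiency has improved over the last decade.",
}

graph = nx.DiGraph()
graph.add_nodes_from(passages)
graph.add_edges_from([("p1", "p2"), ("p2", "p3")])  # causal chain p1 -> p2 -> p3


def lexical_score(query: str, text: str) -> float:
    """Crude stand-in for semantic similarity: fraction of shared words."""
    q = set(query.lower().replace(".", "").split())
    t = set(text.lower().replace(".", "").split())
    return len(q & t) / max(len(q), 1)


def causal_retrieve(query: str, hops: int = 2, top_k: int = 1) -> list[str]:
    """Select the best-matching seed passages, then follow causal edges
    (both causes and effects) up to `hops` steps to keep the chain intact."""
    seeds = sorted(passages, key=lambda p: lexical_score(query, passages[p]),
                   reverse=True)[:top_k]
    selected = set(seeds)
    frontier = set(seeds)
    for _ in range(hops):
        neighbors = set()
        for node in frontier:
            neighbors |= set(graph.successors(node)) | set(graph.predecessors(node))
        frontier = neighbors - selected
        selected |= frontier
    return [passages[p] for p in selected]


if __name__ == "__main__":
    # The seed match is p3; causal expansion also pulls in p2 and p1,
    # while the causally unrelated p4 is never retrieved.
    for passage in causal_retrieve("why are coastal cities flooding"):
        print(passage)
```

The design choice this illustrates is the one the abstract argues for: retrieval is grounded in traced causal relationships, so the unrelated but superficially plausible passage (p4) is excluded, while upstream causes of the matched passage are kept even if they share few words with the query.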

Country of Origin
🇺🇸 United States

Page Count
13 pages

Category
Computer Science:
Computation and Language