CausalRAG: Integrating Causal Graphs into Retrieval-Augmented Generation
By: Nengbo Wang, Xiaotian Han, Jagdip Singh, and more
Potential Business Impact:
Helps AI systems answer questions more accurately by mapping how pieces of information causally connect.
Large language models (LLMs) have revolutionized natural language processing (NLP), particularly through Retrieval-Augmented Generation (RAG), which enhances LLM capabilities by integrating external knowledge. However, traditional RAG systems face critical limitations, including disrupted contextual integrity due to text chunking and over-reliance on semantic similarity for retrieval. To address these issues, we propose CausalRAG, a novel framework that incorporates causal graphs into the retrieval process. By constructing and tracing causal relationships, CausalRAG preserves contextual continuity and improves retrieval precision, leading to more accurate and interpretable responses. We evaluate CausalRAG against regular RAG and graph-based RAG approaches, demonstrating its superiority across several metrics. Our findings suggest that grounding retrieval in causal reasoning provides a promising approach to knowledge-intensive tasks.
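To make the core idea concrete, here is a minimal sketch of causal-graph-guided retrieval. This is not the authors' implementation: the `networkx` graph, the `build_causal_graph` and `causal_retrieve` helpers, the toy cause-effect pairs, and the fixed-hop expansion policy are all illustrative assumptions. The paper's actual pipeline would extract causal links from retrieved documents (e.g., via LLM prompting) and integrate them with a real embedding-based retriever.

```python
from typing import List, Set, Tuple

import networkx as nx


def build_causal_graph(causal_pairs: List[Tuple[str, str]]) -> nx.DiGraph:
    """Build a directed graph whose edges encode 'cause -> effect' links.

    Hypothetical stand-in: in a full system these pairs would be
    extracted from source documents, not supplied by hand.
    """
    graph = nx.DiGraph()
    graph.add_edges_from(causal_pairs)
    return graph


def causal_retrieve(graph: nx.DiGraph, seeds: List[str], hops: int = 2) -> Set[str]:
    """Expand semantically retrieved seed nodes along causal edges.

    Rather than returning only nearest-neighbor passages, trace both
    upstream causes and downstream effects up to `hops` steps, so the
    retrieved context stays causally connected.
    """
    retrieved: Set[str] = set(seeds)
    frontier: Set[str] = set(seeds)
    for _ in range(hops):
        next_frontier: Set[str] = set()
        for node in frontier:
            next_frontier.update(graph.successors(node))    # effects
            next_frontier.update(graph.predecessors(node))  # causes
        next_frontier -= retrieved
        retrieved |= next_frontier
        frontier = next_frontier
    return retrieved


if __name__ == "__main__":
    # Toy causal links for illustration only.
    pairs = [
        ("drought", "crop failure"),
        ("crop failure", "food price spike"),
        ("food price spike", "social unrest"),
    ]
    g = build_causal_graph(pairs)
    # Suppose semantic search matched only "food price spike":
    # causal expansion recovers its causes and consequences too.
    context = causal_retrieve(g, ["food price spike"], hops=2)
    print(sorted(context))
```

A purely similarity-based retriever given the seed "food price spike" might miss "drought" entirely; tracing causal edges keeps the whole cause-effect chain in the retrieved context, which is the contextual continuity the abstract refers to.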
Similar Papers
Causal-Counterfactual RAG: The Integration of Causal-Counterfactual Reasoning into RAG
Computation and Language
Makes AI understand "why" things happen, not just "what."
A Survey of Graph Retrieval-Augmented Generation for Customized Large Language Models
Computation and Language
Helps computers understand complex topics better.