Think Parallax: Solving Multi-Hop Problems via Multi-View Knowledge-Graph-Based Retrieval-Augmented Generation
By: Jinliang Liu
Potential Business Impact:
Helps AI answer questions by finding better facts.
Large language models (LLMs) excel at language understanding but often hallucinate and struggle with multi-hop reasoning. Knowledge-graph-based retrieval-augmented generation (KG-RAG) offers grounding, yet most methods rely on flat embeddings and noisy path exploration. We propose ParallaxRAG, a framework that symmetrically decouples queries and graph triples into multi-view spaces, enabling a robust retrieval architecture that explicitly enforces head diversity while constraining weakly related paths. Central to our approach is the observation that different attention heads specialize in semantic relations at distinct reasoning stages, contributing to different hops of the reasoning chain. This specialization allows ParallaxRAG to construct cleaner subgraphs and guide LLMs through grounded, step-wise reasoning. Experiments on WebQSP and CWQ, under our unified, reproducible setup (BGE-M3 + Llama3.1-8B), demonstrate competitive retrieval and QA performance, along with reduced hallucination and good generalization. Our results highlight multi-view head specialization as a principled direction for knowledge-grounded multi-hop reasoning. Our implementation will be released upon acceptance of the paper.
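
To make the core idea concrete, here is a minimal sketch of multi-view, hop-wise triple retrieval as the abstract describes it: a query and candidate knowledge-graph triples are projected into several per-head "views," each view scores the triples, and weakly related candidates are pruned at each reasoning hop. Everything here is an illustrative assumption, not the authors' released code: the function names (multi_view_scores, select_hop_triples), the use of simple linear projections per head, the head-to-hop correspondence, and the cosine-similarity scoring are all hypothetical stand-ins; the paper's head-diversity constraint and subgraph construction are omitted for brevity.

```python
# Hypothetical sketch of multi-view, hop-wise KG triple retrieval.
# Assumes embeddings (d-dim) from some encoder (e.g., BGE-M3) and one
# linear projection per attention head/view. Not the authors' implementation.
import numpy as np

def multi_view_scores(query_vec: np.ndarray,
                      triple_vecs: np.ndarray,
                      head_projections: list[np.ndarray]) -> np.ndarray:
    """Score every candidate triple under each head's 'view'.

    query_vec:        (d,) query embedding.
    triple_vecs:      (n, d) embeddings of candidate triples.
    head_projections: one (d, d') projection matrix per head/view.
    Returns a (num_heads, n) matrix of cosine similarities.
    """
    per_head = []
    for W in head_projections:
        q = query_vec @ W
        t = triple_vecs @ W
        # Normalize so the dot product below is cosine similarity.
        q = q / (np.linalg.norm(q) + 1e-9)
        t = t / (np.linalg.norm(t, axis=1, keepdims=True) + 1e-9)
        per_head.append(t @ q)
    return np.stack(per_head)

def select_hop_triples(scores: np.ndarray, hop: int,
                       top_k: int = 3, min_score: float = 0.2) -> list[int]:
    """Pick triples for one reasoning hop, pruning weakly related paths.

    Assumes (as the abstract suggests) that head `hop` specializes in the
    relations needed at reasoning stage `hop`; min_score is an assumed
    threshold that discards weakly related candidates.
    """
    hop_scores = scores[hop]
    ranked = np.argsort(hop_scores)[::-1][:top_k]
    return [int(i) for i in ranked if hop_scores[i] >= min_score]

# Toy usage: 2 heads/views (one per hop), 4 candidate triples, 8-dim embeddings.
rng = np.random.default_rng(0)
d, n, heads = 8, 4, 2
scores = multi_view_scores(rng.normal(size=d),
                           rng.normal(size=(n, d)),
                           [rng.normal(size=(d, d)) for _ in range(heads)])
for hop in range(heads):
    print(f"hop {hop}: keep triples {select_hop_triples(scores, hop)}")
```

In this sketch, the surviving triples at each hop would seed the subgraph handed to the LLM for that reasoning step; the actual method's training of the views and its explicit head-diversity objective are beyond what a toy example can show.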
Similar Papers
ProgRAG: Hallucination-Resistant Progressive Retrieval and Reasoning over Knowledge Graphs
Artificial Intelligence
Helps computers answer hard questions using smart steps.
GRIL: Knowledge Graph Retrieval-Integrated Learning with Large Language Models
Machine Learning (CS)
Helps AI answer questions by learning from connected facts.
Human Cognition Inspired RAG with Knowledge Graph for Complex Problem Solving
Machine Learning (CS)
Helps computers solve hard problems by thinking step-by-step.