KGQuest: Template-Driven QA Generation from Knowledge Graphs with LLM-Based Refinement
By: Sania Nayab, Marco Simoni, Giulio Rossolini, and more
Potential Business Impact:
Creates smart questions and answers from facts.
The generation of questions and answers (QA) from knowledge graphs (KGs) plays a crucial role in the development and testing of educational platforms, dissemination tools, and large language models (LLMs). However, existing approaches often struggle with scalability, linguistic quality, and factual consistency. This paper presents a scalable, deterministic pipeline for generating natural language QA pairs from KGs, with an additional LLM-based refinement step to further enhance linguistic quality. The approach first clusters KG triplets by relation and creates reusable templates through natural language rules derived from the entity types of objects and relations. A refinement module then leverages LLMs to polish these templates, improving clarity and coherence while preserving factual accuracy. Finally, answer options are instantiated through a selection strategy that draws distractors from the KG. Our experiments demonstrate that this hybrid approach efficiently generates high-quality QA pairs, combining scalability with fluency and linguistic precision.
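To make the pipeline described in the abstract more concrete, the following is a minimal Python sketch of the relation-clustering, template-instantiation, and distractor-selection steps. The Triplet class, the TEMPLATES table, and the generate_qa function are illustrative assumptions rather than the authors' actual implementation, and the LLM-based template refinement step is omitted here.

```python
import random
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    # Hypothetical KG triplet representation: subject, relation, object,
    # plus the object's entity type (used to pick same-type distractors).
    subject: str
    relation: str
    obj: str
    obj_type: str

# Hypothetical per-relation templates, standing in for the paper's
# rule-derived (and later LLM-refined) templates.
TEMPLATES = {
    "has_capital": "What is the capital of {subject}?",
}

def cluster_by_relation(triplets):
    """Group KG triplets by relation, one cluster per reusable template."""
    clusters = defaultdict(list)
    for t in triplets:
        clusters[t.relation].append(t)
    return clusters

def generate_qa(triplets, num_distractors=3, seed=0):
    """Instantiate QA pairs: fill the relation template with the subject,
    then draw distractors from other KG objects of the same entity type."""
    rng = random.Random(seed)
    clusters = cluster_by_relation(triplets)

    # Candidate distractor pool, keyed by entity type.
    pool_by_type = defaultdict(set)
    for t in triplets:
        pool_by_type[t.obj_type].add(t.obj)

    qa_pairs = []
    for relation, cluster in clusters.items():
        template = TEMPLATES.get(relation)
        if template is None:
            continue  # no rule-derived template for this relation
        for t in cluster:
            question = template.format(subject=t.subject)
            candidates = list(pool_by_type[t.obj_type] - {t.obj})
            distractors = rng.sample(candidates, min(num_distractors, len(candidates)))
            options = distractors + [t.obj]
            rng.shuffle(options)
            qa_pairs.append({"question": question, "answer": t.obj, "options": options})
    return qa_pairs

if __name__ == "__main__":
    kg = [
        Triplet("France", "has_capital", "Paris", "City"),
        Triplet("Italy", "has_capital", "Rome", "City"),
        Triplet("Spain", "has_capital", "Madrid", "City"),
        Triplet("Germany", "has_capital", "Berlin", "City"),
    ]
    for qa in generate_qa(kg):
        print(qa["question"], qa["options"], "->", qa["answer"])
```

In the full pipeline described by the abstract, the hard-coded template strings above would instead be generated from entity-type rules and then passed to an LLM for refinement before instantiation.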
Similar Papers
Large Language Models Meet Knowledge Graphs for Question Answering: Synthesis and Opportunities
Computation and Language
Helps computers answer hard questions better.
KBQA-R1: Reinforcing Large Language Models for Knowledge Base Question Answering
Computation and Language
Helps computers answer questions by checking facts.
SocraticKG: Knowledge Graph Construction via QA-Driven Fact Extraction
Computation and Language
Builds smarter knowledge maps from text.