Improving Table Retrieval with Question Generation from Partial Tables
By: Hsing-Ping Liang, Che-Wei Chang, Yao-Chung Fan
Potential Business Impact:
Helps computers find answers in tables faster.
Recent advances in open-domain question answering over tables have widely adopted large language models (LLMs) under the Retriever-Reader architecture. Prior work has effectively leveraged LLMs to tackle the complex reasoning demands of the Reader component, such as text-to-text, text-to-SQL, and multi-hop reasoning. In contrast, work on the Retriever component has focused primarily on optimizing the query representation: training retrievers to retrieve relevant tables given a question, or to select keywords from the question for matching table segments. Little attention has been paid to how tables themselves are represented in the embedding space so that they better align with questions. To address this, we propose QGpT (Question Generation from Partial Tables), a simple yet effective method that uses an LLM to generate synthetic questions from small portions of a table. These questions simulate how a user might query the content of the table under consideration. The generated questions are then embedded jointly with the partial table segments used to generate them, enhancing semantic alignment with user queries. Without the need to embed entire tables, our method significantly improves retrieval performance across multiple benchmarks for both dense and late-interaction retrievers.
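The indexing idea can be sketched in a few lines. The following is a minimal illustration of the approach described in the abstract, not the paper's released code: it assumes an off-the-shelf dense encoder from sentence-transformers, and the `llm_complete` wrapper, the prompt wording, and the choice of three rows and three questions are all hypothetical placeholders.

```python
# Minimal sketch of QGpT-style indexing: take a partial table, have an LLM
# generate questions a user might ask about it, then embed the questions
# together with the partial table so the table's vector sits closer to the
# query distribution. Details here are illustrative assumptions.

from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any dense retriever works

def linearize_partial_table(header, rows, k=3):
    """Serialize the header plus the first k rows as plain text."""
    lines = [" | ".join(header)]
    lines += [" | ".join(str(c) for c in row) for row in rows[:k]]
    return "\n".join(lines)

def llm_complete(prompt: str) -> str:
    """Hypothetical wrapper around any LLM completion API."""
    raise NotImplementedError("plug in your LLM client here")

def generate_questions(partial_table: str, n: int = 3) -> list[str]:
    """Ask the LLM for questions a user might pose against this table."""
    prompt = (
        f"Given the following table excerpt, write {n} natural-language "
        f"questions a user might ask about its content:\n\n{partial_table}"
    )
    return [q.strip() for q in llm_complete(prompt).splitlines() if q.strip()]

def embed_table_for_index(header, rows):
    """Jointly embed the partial table and its synthetic questions."""
    partial = linearize_partial_table(header, rows)
    questions = generate_questions(partial)
    # Concatenating the synthetic questions with the table segment is what
    # shifts the table's embedding toward user queries; the full table is
    # never embedded.
    document = partial + "\n" + "\n".join(questions)
    return encoder.encode(document)
```

At query time, the user's question is encoded with the same encoder and matched against these augmented table embeddings; the same augmented text could also feed a late-interaction retriever such as ColBERT, per the abstract's claim that both retriever families benefit.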
Similar Papers
Agentic LLMs for Question Answering over Tabular Data
Computation and Language
Answers questions from complex tables using smart computer language.
A Hybrid Search for Complex Table Question Answering in Securities Report
Computation and Language
Helps computers understand tables to answer questions.
Hybrid Graphs for Table-and-Text based Question Answering using LLMs
Computation and Language
Helps computers answer questions from text and tables.