Improving Table Understanding with LLMs and Entity-Oriented Search
By: Thi-Nhung Nguyen, Hoang Ngo, Dinh Phung, and more
Potential Business Impact:
Helps computers understand information in tables better.
Our work addresses the challenges of understanding tables. Existing methods often struggle with the unpredictable nature of table content, leading to a reliance on preprocessing and keyword matching. They are also limited by a lack of contextual information, which complicates the reasoning processes of large language models (LLMs). To overcome these challenges, we introduce an entity-oriented search method to improve table understanding with LLMs. This approach effectively leverages the semantic similarities between questions and table data, as well as the implicit relationships between table cells, minimizing the need for data preprocessing and keyword matching. Additionally, it focuses on table entities, ensuring that table cells are tightly bound semantically, thereby enhancing contextual clarity. Furthermore, we pioneer the use of a graph query language for table understanding, establishing a new research direction. Experiments show that our approach achieves new state-of-the-art performance on the standard benchmarks WikiTableQuestions and TabFact.
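To make the entity-oriented idea concrete, here is a minimal sketch (not the authors' implementation; the table, function names, and key-column convention are illustrative assumptions) of viewing a table as a graph in which each row entity links to its cell values through column-labelled edges, so a question can be answered by following an edge rather than by keyword matching over raw cells:

```python
# Illustrative sketch: an entity-oriented graph view of a table.
# Each row becomes an entity node; each cell becomes an edge from that
# entity, labelled with its column name. (Hypothetical example data.)

table = {
    "header": ["City", "Country", "Population"],
    "rows": [
        ["Hanoi", "Vietnam", "8000000"],
        ["Melbourne", "Australia", "5000000"],
    ],
}

def build_entity_graph(table, key_col=0):
    """Build {entity: {column: value}}, using one column as the entity id."""
    header = table["header"]
    graph = {}
    for row in table["rows"]:
        entity = row[key_col]
        graph[entity] = dict(zip(header, row))
    return graph

def follow_edge(graph, entity, attribute):
    """Answer a simple question by traversing the column-labelled edge."""
    return graph.get(entity, {}).get(attribute)

g = build_entity_graph(table)
print(follow_edge(g, "Hanoi", "Population"))  # -> 8000000
```

In a graph query language such as Cypher, the same lookup would read roughly like `MATCH (e {name: "Hanoi"})-[:Population]->(v) RETURN v`; the sketch above only mimics that traversal in plain Python.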
Similar Papers
A Hybrid Search for Complex Table Question Answering in Securities Report
Computation and Language
Helps computers understand tables to answer questions.
Planning for Success: Exploring LLM Long-term Planning Capabilities in Table Understanding
Computation and Language
Helps computers understand tables to answer questions.
Table as a Modality for Large Language Models
Computation and Language
Helps computers understand charts and tables better.