Tabular Data Understanding with LLMs: A Survey of Recent Advances and Challenges

Published: July 31, 2025 | arXiv ID: 2508.00217v1

By: Xiaofeng Wu, Alan Ritter, Wei Xu

Potential Business Impact:

Helps computers read and reason over tables of all kinds, from database tables to complex spreadsheets.

Tables have gained significant attention in large language models (LLMs) and multimodal large language models (MLLMs) due to their complex and flexible structure. Unlike linear text inputs, tables are two-dimensional, encompassing formats that range from well-structured database tables to complex, multi-layered spreadsheets, each with different purposes. This diversity in format and purpose has led to the development of specialized methods and tasks instead of universal approaches, making navigation of table understanding tasks challenging. To address these challenges, this paper introduces key concepts through a taxonomy of tabular input representations and an overview of table understanding tasks. We highlight several critical gaps in the field that indicate the need for further research: (1) the predominance of retrieval-focused tasks that require minimal reasoning beyond mathematical and logical operations; (2) significant challenges faced by models when processing complex table structures, large-scale tables, long contexts, or multi-table scenarios; and (3) the limited generalization of models across different tabular representations and formats.
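To make the idea of differing tabular input representations concrete, here is a minimal sketch (not taken from the paper, and not its taxonomy) showing one small table serialized into three textual formats that LLMs are commonly prompted with; the sample records and helper functions are illustrative assumptions.

```python
import json

# Illustrative records; any tabular data would do.
rows = [
    {"country": "US", "year": 2023, "gdp_trillion_usd": 27.4},
    {"country": "JP", "year": 2023, "gdp_trillion_usd": 4.2},
]

def to_markdown(records):
    # Markdown table: headers, separator row, then one row per record.
    headers = list(records[0].keys())
    lines = ["| " + " | ".join(headers) + " |",
             "| " + " | ".join("---" for _ in headers) + " |"]
    for r in records:
        lines.append("| " + " | ".join(str(r[h]) for h in headers) + " |")
    return "\n".join(lines)

def to_json(records):
    # JSON list of row objects, a common structured serialization.
    return json.dumps(records, indent=2)

def to_html(records):
    # Plain HTML table, the representation many web-sourced tables arrive in.
    headers = list(records[0].keys())
    head = "".join(f"<th>{h}</th>" for h in headers)
    body = "".join(
        "<tr>" + "".join(f"<td>{r[h]}</td>" for h in headers) + "</tr>"
        for r in records
    )
    return f"<table><tr>{head}</tr>{body}</table>"

if __name__ == "__main__":
    for name, text in [("markdown", to_markdown(rows)),
                       ("json", to_json(rows)),
                       ("html", to_html(rows))]:
        print(f"--- {name} ---\n{text}\n")
```

The same underlying table yields quite different token sequences under each serialization, which is one reason model performance can vary across representations, as the survey's third gap notes.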

Country of Origin
🇺🇸 United States

Page Count
15 pages

Category
Computer Science:
Computation and Language