Structural Deep Encoding for Table Question Answering

Published: March 3, 2025 | arXiv ID: 2503.01457v1

By: Raphaël Mouravieff, Benjamin Piwowarski, Sylvain Lamprier

Potential Business Impact:

Helps computers answer questions about data tables more accurately and efficiently.

Business Areas:
Text Analytics, Data and Analytics, Software

Although Transformer-based architectures excel at processing textual information, their naive adaptation to tabular data often involves flattening the table structure. This simplification can lead to the loss of essential inter-dependencies between rows, columns, and cells, while also posing scalability challenges for large tables. To address these issues, prior works have explored special tokens, structured embeddings, and sparse attention patterns. In this paper, we conduct a comprehensive analysis of tabular encoding techniques, which highlights the crucial role of attention sparsity in preserving the structural information of tables. We also introduce a set of novel sparse attention mask designs for tabular data that not only enhance computational efficiency but also preserve structural integrity, leading to better overall performance.
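To illustrate the general idea of a structure-aware sparse attention mask (not the paper's specific designs), the sketch below builds a mask for a table flattened row-major, one token per cell, where each cell attends only to cells in its own row or column. This keeps row/column inter-dependencies visible to attention while reducing each cell's neighbourhood from R*C to R+C-1 tokens. The function name and layout are illustrative assumptions.

```python
import numpy as np

def row_col_attention_mask(n_rows: int, n_cols: int) -> np.ndarray:
    """Boolean attention mask for a table flattened row-major, one
    token per cell (illustrative sketch, not the paper's exact masks).

    Token i may attend to token j only when the two cells share a row
    or a column, so the 2-D structure survives flattening while the
    mask stays sparse: each cell sees n_rows + n_cols - 1 neighbours
    instead of all n_rows * n_cols cells.
    """
    n = n_rows * n_cols
    rows = np.arange(n) // n_cols  # row index of each flattened token
    cols = np.arange(n) % n_cols   # column index of each flattened token
    same_row = rows[:, None] == rows[None, :]
    same_col = cols[:, None] == cols[None, :]
    return same_row | same_col

# A 3x4 table flattens to 12 tokens; every cell attends to itself,
# its 3 row-mates, and its 2 column-mates: 4 + 3 - 1 = 6 tokens.
mask = row_col_attention_mask(3, 4)
assert mask.shape == (12, 12)
assert mask.sum(axis=1).tolist() == [6] * 12
```

In practice such a mask would be combined with extra connections for question tokens and headers (which typically attend globally); this sketch covers only the cell-to-cell pattern.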

Repos / Data Links

Page Count
14 pages

Category
Computer Science:
Computation and Language