Structural Deep Encoding for Table Question Answering
By: Raphaël Mouravieff, Benjamin Piwowarski, Sylvain Lamprier
Potential Business Impact:
Helps computers understand tables better.
Although Transformer-based architectures excel at processing textual information, their naive adaptation to tabular data often involves flattening the table structure. This simplification can lead to the loss of essential interdependencies between rows, columns, and cells, while also posing scalability challenges for large tables. To address these issues, prior works have explored special tokens, structured embeddings, and sparse attention patterns. In this paper, we conduct a comprehensive analysis of tabular encoding techniques, which highlights the crucial role of attention sparsity in preserving the structural information of tables. We also introduce a set of novel sparse attention mask designs for tabular data that not only enhance computational efficiency but also preserve structural integrity, leading to better overall performance.
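To make the idea of structure-preserving sparse attention concrete, here is a minimal sketch of one common mask design for flattened tables, in which a cell token may only attend to tokens in the same row or column. This is an illustration of the general technique the abstract refers to, not the paper's specific mask designs; the function name and the NumPy implementation are assumptions made for this example.

```python
import numpy as np

def row_column_attention_mask(n_rows: int, n_cols: int) -> np.ndarray:
    """Boolean attention mask for a table flattened row by row.

    Cell i may attend to cell j only if they share a row or a column,
    keeping attention sparse while retaining the table's 2-D structure.
    (Illustrative sketch, not the mask designs proposed in the paper.)
    """
    n = n_rows * n_cols
    rows = np.arange(n) // n_cols   # row index of each flattened cell token
    cols = np.arange(n) % n_cols    # column index of each flattened cell token
    same_row = rows[:, None] == rows[None, :]
    same_col = cols[:, None] == cols[None, :]
    return same_row | same_col      # True = attention allowed

# Example: a 3x4 table yields a 12x12 mask where each token attends to
# the 6 tokens sharing its row or column (including itself), i.e. half
# of all pairs instead of the full dense attention pattern.
mask = row_column_attention_mask(n_rows=3, n_cols=4)
print(mask.shape, mask.mean())
```

Such a mask scales with the number of cells per row and column rather than the full table size, which is why sparsity helps both efficiency and structure preservation.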
Similar Papers
TABLET: Table Structure Recognition using Encoder-only Transformers
CV and Pattern Recognition
Helps computers understand messy tables faster.
Basis Transformers for Multi-Task Tabular Regression
Machine Learning (CS)
Helps computers understand messy data better.
Datum-wise Transformer for Synthetic Tabular Data Detection in the Wild
Machine Learning (CS)
Finds fake computer-made tables of information.