Towards a Relationship-Aware Transformer for Tabular Data
By: Andrei V. Konstantinov, Valerii A. Zuev, Lev V. Utkin
Potential Business Impact:
Helps computers learn better from related data.
Deep learning models for tabular data typically do not allow imposing a graph of external dependencies between samples, which can be useful for accounting for relatedness in tasks such as treatment effect estimation. Graph neural networks aggregate information only from adjacent nodes, which makes them difficult to apply to sparse relationship graphs. This paper proposes several solutions based on a modified attention mechanism that accounts for possible relationships between data points by adding a term to the attention matrix. Our models are compared with each other and with gradient boosting decision trees in a regression task on synthetic and real-world datasets, as well as in a treatment effect estimation task on the IHDP dataset.
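As a rough illustration of the core idea, and not the authors' actual implementation, the sketch below adds a learnable scalar multiple of a relationship adjacency matrix to the pre-softmax attention logits. PyTorch, the class and parameter names, and the exact form of the additive term are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class RelationshipAwareAttention(nn.Module):
    """Scaled dot-product attention with an additive relationship term.

    A relationship graph between samples is supplied as an adjacency
    matrix; a learnable scalar weights its contribution to the attention
    logits before the softmax. This is a minimal sketch of the idea, not
    the paper's architecture.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Learnable scale for the relationship term (hypothetical choice).
        self.rel_weight = nn.Parameter(torch.zeros(1))
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n_samples, dim); adj: (n_samples, n_samples) relationship graph.
        logits = self.q(x) @ self.k(x).transpose(-1, -2) * self.scale
        # Add the relationship term to the attention matrix's logits.
        logits = logits + self.rel_weight * adj
        return torch.softmax(logits, dim=-1) @ self.v(x)

# Toy usage: 8 samples, 16 features, a random symmetric relationship graph.
x = torch.randn(8, 16)
adj = (torch.rand(8, 8) < 0.2).float()
adj = ((adj + adj.T) > 0).float()
out = RelationshipAwareAttention(16)(x, adj)
print(out.shape)  # torch.Size([8, 16])
```

Because the relationship term enters the logits additively, a zero-initialized weight recovers standard attention, and the model can learn how strongly to trust the external graph; this is one plausible reading of "adding a term to the attention matrix," not a claim about the paper's exact formulation.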
Similar Papers
Scaling Graph Transformers: A Comparative Study of Sparse and Dense Attention
Machine Learning (CS)
Helps computers understand complex connections better.
Tab-PET: Graph-Based Positional Encodings for Tabular Transformers
Machine Learning (CS)
Helps computers learn better from messy data.
Boosting Relational Deep Learning with Pretrained Tabular Models
Databases
Makes computer predictions faster using connected data.