Integrating Temporal and Structural Context in Graph Transformers for Relational Deep Learning
By: Divyansha Lachi, Mahmoud Mohammadi, Joe Meyer, and more
Potential Business Impact:
Helps computers understand complex relationships over time.
In domains such as healthcare, finance, and e-commerce, the temporal dynamics of relational data emerge from complex interactions, such as those between patients and providers, or between users and products across diverse categories. To be broadly useful, models operating on these data must integrate long-range spatial and temporal dependencies across diverse entity types while also supporting multiple predictive tasks. However, existing graph models for relational data focus primarily on spatial structure, treating temporal information merely as a filtering constraint to exclude future events rather than as a modeling signal, and are typically designed for single-task prediction. To address these gaps, we introduce a temporal subgraph sampler that enhances global context by retrieving nodes beyond the immediate neighborhood to capture temporally relevant relationships. In addition, we propose the Relational Graph Perceiver (RGP), a graph transformer architecture for relational deep learning that uses a cross-attention-based latent bottleneck to efficiently integrate information from both structural and temporal contexts. This latent bottleneck fuses signals from different node and edge types into a common latent space, enabling the model to build global context across the entire relational system. RGP also incorporates a flexible cross-attention decoder that supports joint learning across tasks with disjoint label spaces within a single model. Experiments on RelBench, SALT, and CTU show that RGP delivers state-of-the-art performance, offering a general and scalable solution for relational deep learning with support for diverse predictive tasks.
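For readers unfamiliar with Perceiver-style bottlenecks, the sketch below illustrates the general pattern the abstract describes: a small set of learned latent vectors cross-attends over a much larger set of token embeddings (standing in for sampled nodes and edges already projected into a shared space), so subsequent computation scales with the number of latents rather than the number of tokens. This is a minimal PyTorch illustration under assumed names and dimensions (LatentBottleneck, num_latents, dim), not the authors' RGP implementation.

```python
# Hypothetical sketch of a Perceiver-style latent bottleneck (not the authors' code).
# A fixed set of learned latents cross-attends over token embeddings produced from
# heterogeneous node/edge types, compressing the relational context into a small
# latent array that later layers (and a task decoder) can attend to.
from typing import Optional

import torch
import torch.nn as nn


class LatentBottleneck(nn.Module):
    def __init__(self, dim: int = 256, num_latents: int = 64, num_heads: int = 8):
        super().__init__()
        # Learned latent array shared across inputs (assumed design choice).
        self.latents = nn.Parameter(torch.randn(num_latents, dim) * 0.02)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(
        self, tokens: torch.Tensor, pad_mask: Optional[torch.Tensor] = None
    ) -> torch.Tensor:
        # tokens: (batch, num_tokens, dim) -- embeddings of sampled nodes/edges,
        # assumed to already carry type and time encodings in a shared space.
        batch = tokens.size(0)
        lat = self.latents.unsqueeze(0).expand(batch, -1, -1)
        # Latents read from the (possibly large) token set via cross-attention.
        lat = lat + self.cross_attn(
            self.norm1(lat), tokens, tokens,
            key_padding_mask=pad_mask, need_weights=False,
        )[0]
        # Cheap self-attention among the few latents builds global context.
        q = self.norm2(lat)
        lat = lat + self.self_attn(q, q, q, need_weights=False)[0]
        return lat + self.ffn(lat)


# Example: 500 sampled tokens compressed into 64 latents.
tokens = torch.randn(2, 500, 256)
print(LatentBottleneck()(tokens).shape)  # torch.Size([2, 64, 256])
```

The design point the abstract emphasizes is that all node and edge types feed the same latent array, so the cross-attention step is where heterogeneous structural and temporal context gets integrated; a task-specific cross-attention decoder can then read from these latents for each prediction task.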
Similar Papers
Relational Graph Transformer
Machine Learning (CS)
Helps computers learn better from connected data.
Contextual Graph Transformer: A Small Language Model for Enhanced Engineering Document Information Extraction
Computation and Language
Helps computers understand hard technical writing.
Context Guided Transformer Entropy Modeling for Video Compression
Computer Vision and Pattern Recognition
Makes videos smaller and faster to watch.