RelDiff: Relational Data Generative Modeling with Graph-Based Diffusion Models
By: Valter Hudovernik, Minkai Xu, Juntong Shi, and more
Potential Business Impact:
Creates realistic fake databases that work like real ones.
Real-world databases are predominantly relational, comprising multiple interlinked tables that contain complex structural and statistical dependencies. Learning generative models on relational data has shown great promise in generating synthetic data and imputing missing values. However, existing methods often struggle to capture this complexity, typically reducing relational data to conditionally generated flat tables and imposing limiting structural assumptions. To address these limitations, we introduce RelDiff, a novel diffusion generative model that synthesizes complete relational databases by explicitly modeling their foreign key graph structure. RelDiff combines a joint graph-conditioned diffusion process across all tables for attribute synthesis with a 2K+SBM graph generator, based on the Stochastic Block Model, for structure generation. Decomposing the generation of graph structure from that of relational attributes ensures both high fidelity and referential integrity, two crucial requirements for synthetic relational database generation. Experiments on 11 benchmark datasets demonstrate that RelDiff consistently outperforms prior methods in producing realistic and coherent synthetic relational databases. Code is available at https://github.com/ValterH/RelDiff.
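Since the abstract describes a two-stage decomposition (first sample the foreign-key graph structure, then synthesize attributes with a graph-conditioned diffusion process), the following is a minimal, purely illustrative Python sketch of that pipeline. All names here (FKGraph, sample_fk_graph, denoise_attributes) are hypothetical, and the 2K+SBM structure generator is replaced by random parent assignment to keep the example self-contained; see the linked repository for the actual implementation.

import random
from dataclasses import dataclass, field

@dataclass
class FKGraph:
    """Foreign-key graph: one node per row, edges follow FK references."""
    nodes: dict = field(default_factory=dict)   # node_id -> table name
    edges: list = field(default_factory=list)   # (child_id, parent_id)

def sample_fk_graph(n_parents: int, n_children: int, seed: int = 0) -> FKGraph:
    """Stage 1 (structure): sample a synthetic FK graph.
    RelDiff uses a 2K+SBM generator; here each child row is simply
    attached to a random parent (a stand-in, not the paper's method)."""
    rng = random.Random(seed)
    g = FKGraph()
    for p in range(n_parents):
        g.nodes[f"user_{p}"] = "users"
    for c in range(n_children):
        g.nodes[f"order_{c}"] = "orders"
        g.edges.append((f"order_{c}", f"user_{rng.randrange(n_parents)}"))
    return g

def denoise_attributes(graph: FKGraph, n_steps: int = 10, seed: int = 0) -> dict:
    """Stage 2 (attributes): graph-conditioned diffusion, caricatured as
    iteratively denoising a scalar per row while conditioning each child
    on its parent's value (a toy stand-in for graph message passing)."""
    rng = random.Random(seed)
    parent_of = dict(graph.edges)
    x = {n: rng.gauss(0.0, 1.0) for n in graph.nodes}            # start from noise
    for _ in range(n_steps):
        for n in graph.nodes:
            cond = x.get(parent_of.get(n), x[n])                 # condition on parent row
            x[n] = 0.5 * (x[n] + cond) + 0.1 * rng.gauss(0, 1)   # denoise + small noise
    return x

if __name__ == "__main__":
    g = sample_fk_graph(n_parents=3, n_children=8)
    attrs = denoise_attributes(g)
    print(f"{len(g.nodes)} rows, {len(g.edges)} FK edges; sample attr:", attrs["order_0"])

Because structure is fixed before attributes are synthesized, every generated child row references a valid parent by construction, which is how the decomposition preserves referential integrity.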
Similar Papers
Relational Database Distillation: From Structured Tables to Condensed Graph Data
Databases
Shrinks big databases for faster learning.
DiffGraph: Heterogeneous Graph Diffusion Model
Machine Learning (CS)
Cleans messy data for better computer predictions.
Relational Graph Transformer
Machine Learning (CS)
Helps computers learn better from connected data.