Diffusion Transformers for Imputation: Statistical Efficiency and Uncertainty Quantification

Published: October 2, 2025 | arXiv ID: 2510.02216v1

By: Zeqi Ye, Minshuo Chen

Potential Business Impact:

Fills in missing values in time-series data and quantifies the uncertainty of the imputed values.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Imputation methods play a critical role in enhancing the quality of practical time-series data, which often suffer from pervasive missing values. Recently, diffusion-based generative imputation methods have demonstrated remarkable success compared to autoregressive and conventional statistical approaches. Despite this empirical success, theoretical understanding of how well diffusion-based models capture the complex spatial and temporal dependencies between missing and observed values remains limited. Our work addresses this gap by investigating the statistical efficiency of conditional diffusion transformers for imputation and quantifying the uncertainty in missing values. Specifically, we derive statistical sample complexity bounds based on a novel approximation theory for conditional score functions using transformers and, through this, construct tight confidence regions for missing values. Our findings also reveal that the efficiency and accuracy of imputation are significantly influenced by the missing patterns. Furthermore, we validate these theoretical insights through simulations and propose a mixed-masking training strategy to enhance imputation performance.
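The abstract notes that imputation accuracy depends on the missing pattern and proposes a mixed-masking training strategy, but does not spell out its details. Below is a hypothetical sketch of what mixing masking patterns during training could look like: with some probability a batch uses pointwise random missingness, otherwise a contiguous block of time steps is masked. All function names and parameters here are illustrative assumptions, not the paper's actual method (a mask entry of 1 means observed, 0 means missing).

```python
import numpy as np

def random_mask(shape, missing_ratio, rng):
    """Each entry is independently missing with probability `missing_ratio`."""
    return (rng.random(shape) >= missing_ratio).astype(np.float64)

def block_mask(shape, block_len, rng):
    """A contiguous block of `block_len` time steps is missing across all features."""
    T, D = shape
    mask = np.ones(shape)
    start = rng.integers(0, max(T - block_len, 1))
    mask[start:start + block_len, :] = 0.0
    return mask

def mixed_mask(shape, missing_ratio, block_len, p_random, rng):
    """Hypothetical mixed-masking sampler: with probability `p_random` draw a
    pointwise-random mask, otherwise a block-missing mask."""
    if rng.random() < p_random:
        return random_mask(shape, missing_ratio, rng)
    return block_mask(shape, block_len, rng)

# Example: sample one training mask for a 48-step, 4-feature series.
rng = np.random.default_rng(0)
m = mixed_mask((48, 4), missing_ratio=0.3, block_len=12, p_random=0.5, rng=rng)
```

During training, the conditional score model would then be fit to predict the masked entries given the observed ones, so that it sees both missing regimes rather than specializing to a single pattern.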

Country of Origin
🇺🇸 United States

Repos / Data Links

Page Count
49 pages

Category
Computer Science:
Machine Learning (CS)