Temporal Variational Implicit Neural Representations

Published: June 2, 2025 | arXiv ID: 2506.01544v1

By: Batuhan Koyuncu, Rachael DeVries, Ole Winther, and more

Potential Business Impact:

Imputes missing values and forecasts irregularly sampled time series in a single forward pass.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

We introduce Temporal Variational Implicit Neural Representations (TV-INRs), a probabilistic framework for modeling irregular multivariate time series that enables efficient, individualized imputation and forecasting. By integrating implicit neural representations with latent variable models, TV-INRs learn distributions over time-continuous generator functions conditioned on signal-specific covariates. Unlike existing approaches that require extensive training, fine-tuning, or meta-learning, our method achieves accurate individualized predictions through a single forward pass. Our experiments demonstrate that a single TV-INR instance can accurately solve diverse imputation and forecasting tasks, offering a computationally efficient and scalable solution for real-world applications. TV-INRs excel especially in low-data regimes, where they outperform existing methods by an order of magnitude in mean squared error on imputation tasks.
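To make the single-forward-pass idea concrete, here is a minimal, illustrative sketch of the pipeline the abstract describes: an amortized encoder maps an irregularly sampled signal to a latent code, and a time-continuous generator (an implicit neural representation conditioned on that latent) can then be queried at arbitrary timestamps. All network sizes, the mean-pooling encoder, and the use of plain NumPy with random weights are assumptions for illustration, not the paper's actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights):
    # Simple MLP: tanh on hidden layers, linear output layer.
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def init(sizes, rng):
    # Randomly initialized (untrained) weights -- illustration only.
    return [(rng.normal(0.0, 0.3, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

latent_dim = 8
enc = init([2, 32, latent_dim], rng)       # (t, y) pair -> per-point feature
dec = init([1 + latent_dim, 32, 1], rng)   # (t, z) -> predicted value

# An irregularly sampled signal: observed timestamps and values.
t_obs = np.sort(rng.uniform(0.0, 1.0, 15))
y_obs = np.sin(2 * np.pi * t_obs)

# Amortized "encoder": pool per-observation features into a latent z
# (standing in for, e.g., the mean of q(z | observations)).
feats = mlp(np.stack([t_obs, y_obs], axis=1), enc)
z = feats.mean(axis=0)

# Time-continuous generator: query the latent-conditioned INR at ANY
# timestamps, including unobserved ones -- imputation and forecasting
# both reduce to one forward pass with different query times.
t_query = np.linspace(0.0, 1.0, 50)
inp = np.concatenate([t_query[:, None],
                      np.tile(z, (len(t_query), 1))], axis=1)
y_hat = mlp(inp, dec).ravel()

print(y_hat.shape)  # one prediction per query timestamp
```

Because the generator is a function of continuous time, no retraining or fine-tuning per signal is needed at inference: a new signal only requires encoding it to a new latent `z`.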

Country of Origin
🇩🇪 Germany

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)