Out-of-Distribution Generalization in Time Series: A Survey
By: Xin Wu, Fei Teng, Xingwang Li, and more
Potential Business Impact:
Helps computers keep making reliable predictions even when the data they see changes over time.
Time series frequently manifest distribution shifts, diverse latent features, and non-stationary learning dynamics, particularly in open and evolving environments. These characteristics pose significant challenges for out-of-distribution (OOD) generalization. While substantial progress has been made, a systematic synthesis of these advances remains lacking. To address this gap, we present the first comprehensive review of OOD generalization methodologies for time series, organized to delineate the field's evolutionary trajectory and contemporary research landscape. We structure our analysis along three foundational dimensions: data distribution, representation learning, and OOD evaluation. For each dimension, we examine several popular algorithms in detail. Furthermore, we highlight key application scenarios, emphasizing their real-world impact. Finally, we identify persistent challenges and propose future research directions. A detailed summary of the methods reviewed for OOD generalization in time series is available at https://tsood-generalization.com.
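To make the data-distribution dimension concrete, the sketch below illustrates one representative idea commonly used against distribution shift in time-series forecasting: per-window (instance) normalization in the spirit of RevIN. This is an illustrative assumption, not a method taken from the survey; the function names (instance_normalize, instance_denormalize) and the naive last-value forecaster are hypothetical placeholders.

```python
# Minimal sketch (assumption, not the survey's method): per-window instance
# normalization, a common way to reduce the mismatch between training and
# test windows when the level/scale of a time series drifts over time.
import numpy as np

def instance_normalize(window: np.ndarray, eps: float = 1e-5):
    """Normalize one input window of shape (time, features); return stats for later denormalization."""
    mean = window.mean(axis=0, keepdims=True)
    std = window.std(axis=0, keepdims=True) + eps
    return (window - mean) / std, (mean, std)

def instance_denormalize(prediction: np.ndarray, stats):
    """Map a prediction made in normalized space back to the original scale of its input window."""
    mean, std = stats
    return prediction * std + mean

# Usage on a toy non-stationary series whose level drifts (random walk).
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=(200, 1)), axis=0)
window, horizon = series[:96], series[96:120]
x_norm, stats = instance_normalize(window)
# A forecaster trained on normalized windows would run here; we use a naive
# last-value baseline purely to show the normalize -> predict -> denormalize flow.
y_hat_norm = np.repeat(x_norm[-1:], horizon.shape[0], axis=0)
y_hat = instance_denormalize(y_hat_norm, stats)
```

The design point is that the model only ever sees windows with roughly zero mean and unit variance, so a shift in the series' level between training and deployment is absorbed by the per-window statistics rather than the learned weights.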
Similar Papers
Evolving Graph Learning for Out-of-Distribution Generalization in Non-stationary Environments
Machine Learning (CS)
Helps computers learn from changing data better.
Generalized Few-Shot Out-of-Distribution Detection
CV and Pattern Recognition
Helps AI spot unfamiliar, out-of-distribution data, even with only a few examples.
Out-of-distribution generalisation is hard: evidence from ARC-like tasks
Machine Learning (CS)
Shows why computers struggle to generalize to problems unlike those they were trained on.