A Unified Contrastive-Generative Framework for Time Series Classification

Published: August 13, 2025 | arXiv ID: 2508.09451v1

By: Ziyu Liu, Azadeh Alavi, Minyi Li, and more

Potential Business Impact:

Helps computers learn useful patterns from time-series data (e.g., sensor or medical signals) without relying on large labeled datasets.

Self-supervised learning (SSL) for multivariate time series spans two main paradigms: contrastive methods, which excel at instance discrimination, and generative approaches, which model the data distribution. While effective individually, their complementary potential remains unexplored. We propose the Contrastive Generative Time series framework (CoGenT), the first framework to unify these paradigms through joint contrastive-generative optimization. CoGenT addresses fundamental limitations of both approaches: it overcomes contrastive learning's sensitivity to the high intra-class similarity of temporal data while reducing generative methods' dependence on large datasets. We evaluate CoGenT on six diverse time series datasets. The results show consistent improvements, with up to 59.2% and 14.27% F1 gains over standalone SimCLR and MAE, respectively. Our analysis reveals that the hybrid objective preserves discriminative power while acquiring generative robustness. These findings establish a foundation for hybrid SSL in temporal domains. We will release the code shortly.
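The paper's code has not been released yet, so the sketch below is only an illustration of what a joint contrastive-generative objective of this kind could look like in PyTorch: a SimCLR-style NT-Xent term over two augmented views plus an MAE-style masked-reconstruction term, combined with a weighting coefficient. The function names, tensor shapes, and the `lambda_gen` weight are assumptions, not the authors' implementation.

```python
# Illustrative sketch of a joint contrastive-generative loss for time series.
# NOT the CoGenT authors' code; shapes and the lambda_gen weight are assumed.
import torch
import torch.nn.functional as F


def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss over two augmented views, each (B, D)."""
    B = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, D)
    sim = z @ z.t() / temperature                         # (2B, 2B) similarities
    sim.fill_diagonal_(float("-inf"))                      # exclude self-similarity
    # The positive for row i is the other view of the same sample.
    targets = torch.cat([torch.arange(B, 2 * B), torch.arange(0, B)]).to(sim.device)
    return F.cross_entropy(sim, targets)


def masked_reconstruction_loss(x, x_hat, mask):
    """MAE-style loss: MSE computed only on masked timesteps.

    x, x_hat: (B, T, C) original and reconstructed series; mask: (B, T), 1 = masked.
    """
    per_step = ((x_hat - x) ** 2).mean(dim=-1)            # (B, T)
    return (per_step * mask).sum() / mask.sum().clamp(min=1)


def joint_ssl_loss(z1, z2, x, x_hat, mask, lambda_gen=1.0):
    """Hybrid objective: contrastive term plus weighted generative term."""
    return nt_xent_loss(z1, z2) + lambda_gen * masked_reconstruction_loss(x, x_hat, mask)
```

In a setup like this, a shared encoder would feed both branches: pooled embeddings of two augmentations go to the contrastive term, while per-timestep features from a masked input drive the reconstruction term, and `lambda_gen` balances the two.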

Country of Origin
🇦🇺 Australia

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)