Score: 1

rETF-semiSL: Semi-Supervised Learning for Neural Collapse in Temporal Data

Published: August 13, 2025 | arXiv ID: 2508.10147v1

By: Yuhan Xie, William Cappelletti, Mahsa Shoaran, and more

Potential Business Impact:

Helps AI systems classify time-series data accurately even when only a few labeled examples are available, reducing data-labeling costs.

Deep neural networks for time series must capture complex temporal patterns to represent dynamic data effectively. Self- and semi-supervised learning methods show promising results in pre-training large models, which, when fine-tuned for classification, often outperform counterparts trained from scratch. Still, the choice of pretext training tasks is often heuristic, and their transferability to downstream classification is not guaranteed. We therefore propose a novel semi-supervised pre-training strategy that enforces latent representations satisfying the Neural Collapse phenomenon observed in optimally trained neural classifiers. We use a rotational equiangular tight frame (ETF) classifier and pseudo-labeling to pre-train deep encoders with few labeled samples. Furthermore, to capture temporal dynamics while enforcing embedding separability, we integrate generative pretext tasks with our method and define a novel sequential augmentation strategy. We show that our method significantly outperforms previous pretext tasks when applied to LSTMs, transformers, and state-space models on three multivariate time series classification datasets. These results highlight the benefit of aligning pre-training objectives with a theoretically grounded embedding geometry.

Country of Origin
🇨🇭 Switzerland

Page Count
12 pages

Category
Computer Science: Machine Learning (CS)