Symbol-Temporal Consistency Self-supervised Learning for Robust Time Series Classification
By: Kevin Garcia, Cassandra Garza, Brooklyn Berry, et al.
Potential Business Impact:
Learns health patterns even with messy data.
The growing significance of time series in digital health domains necessitates advanced methodologies for extracting meaningful patterns and representations. Self-supervised contrastive learning has emerged as a promising approach for learning directly from raw data. However, time series data in digital health is highly noisy, inherently involves concept drift, and poses a challenge for training a generalizable deep learning model. In this paper, we focus specifically on data distribution shift caused by differing human behaviors and propose a self-supervised learning framework that is aware of the bag-of-symbols representation. The bag-of-symbols representation is known for its insensitivity to data warping, location shifts, and noise existing in time series data, making it potentially pivotal in guiding deep learning to acquire a representation resistant to such data shift. We demonstrate that the proposed method achieves significantly better performance when substantial data shift exists.
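To make the shift-insensitivity claim concrete, here is a minimal sketch of a SAX-style bag-of-symbols transform (z-normalization, piecewise aggregation, and equiprobable discretization). The function name, segment count, and 4-letter alphabet are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np
from collections import Counter

# Standard-normal breakpoints for a 4-letter alphabet (SAX convention):
# they split N(0, 1) into four equiprobable regions.
BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])

def bag_of_symbols(series, n_segments=8):
    """Hypothetical helper: map a 1-D series to a symbol histogram."""
    x = np.asarray(series, dtype=float)
    # z-normalize so offset and scale changes do not alter the symbols
    x = (x - x.mean()) / (x.std() + 1e-8)
    # Piecewise Aggregate Approximation: mean value of each segment
    paa = np.array([s.mean() for s in np.array_split(x, n_segments)])
    # Discretize segment means into letters 'a'..'d'
    symbols = np.digitize(paa, BREAKPOINTS)
    return Counter(chr(ord("a") + s) for s in symbols)

# A clean sine wave and a time-shifted, lightly noised copy yield
# similar histograms, since the bag discards symbol ordering.
rng = np.random.default_rng(0)
sine = np.sin(np.linspace(0, 2 * np.pi, 64))
shifted = np.roll(sine, 5) + 0.05 * rng.normal(size=64)
print(bag_of_symbols(sine))
print(bag_of_symbols(shifted))
```

Because the histogram discards where each symbol occurs, warping or shifting the series changes it far less than it changes the raw values, which is the property the abstract leans on.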
Similar Papers
Extracting Symbolic Sequences from Visual Representations via Self-Supervised Learning
CV and Pattern Recognition
Teaches computers to understand pictures like words.
Self-Supervised Dynamical System Representations for Physiological Time-Series
Machine Learning (CS)
Helps computers understand body signals better.
SigTime: Learning and Visually Explaining Time Series Signatures
Machine Learning (CS)
Finds hidden patterns in health data.