Hi-WaveTST: A Hybrid High-Frequency Wavelet-Transformer for Time-Series Classification
By: Huseyin Goksu
Potential Business Impact:
Finds hidden patterns in time data for better predictions.
Transformers have become state-of-the-art (SOTA) for time-series classification, with models like PatchTST demonstrating exceptional performance. These models rely on patching the time series and learning relationships between raw temporal data blocks. We argue that this approach is blind to critical, non-obvious high-frequency information that is complementary to the temporal dynamics. In this letter, we propose Hi-WaveTST, a novel hybrid architecture that augments the original temporal patch with a learnable, high-frequency wavelet feature stream. Our wavelet stream applies a deep Wavelet Packet Decomposition (WPD) to each patch and extracts features using a learnable Generalized Mean (GeM) pooling layer. On the UCI-HAR benchmark dataset, our hybrid model achieves a mean accuracy of 93.38% ± 0.0043, significantly outperforming the SOTA PatchTST baseline (92.59% ± 0.0039). A comprehensive ablation study shows that every component of our design (the hybrid architecture, the deep high-frequency wavelet decomposition, and the learnable GeM pooling) is essential for this state-of-the-art performance.
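The wavelet feature stream described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the Haar wavelet (the paper does not specify the wavelet family here), a fixed GeM exponent `p` rather than a learnable one, and hypothetical helper names (`haar_wpd`, `gem_pool`).

```python
import numpy as np

def haar_wpd(x, levels=3):
    """Wavelet Packet Decomposition with the Haar wavelet (illustrative).

    Unlike a plain DWT, WPD recursively splits *both* the low- and
    high-frequency halves, so deep levels preserve high-frequency detail.
    Returns a list of 2**levels subband arrays.
    """
    subbands = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        next_level = []
        for band in subbands:
            even, odd = band[0::2], band[1::2]
            next_level.append((even + odd) / np.sqrt(2))  # low-pass half
            next_level.append((even - odd) / np.sqrt(2))  # high-pass half
        subbands = next_level
    return subbands

def gem_pool(x, p=3.0, eps=1e-6):
    """Generalized Mean (GeM) pooling: p = 1 gives average pooling and
    large p approaches max pooling. In Hi-WaveTST p is learnable; it is
    fixed here for illustration."""
    x = np.clip(np.abs(x), eps, None)
    return np.mean(x ** p) ** (1.0 / p)

# One 64-sample "patch" -> 8 subbands -> one GeM feature per subband
rng = np.random.default_rng(0)
patch = rng.normal(size=64)
features = np.array([gem_pool(b) for b in haar_wpd(patch, levels=3)])
print(features.shape)  # (8,)
```

The resulting per-subband feature vector is what a hybrid model could concatenate with the raw temporal patch embedding before the Transformer encoder.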
Similar Papers
STL-FFT-STFT-TCN-LSTM: An Effective Wave Height High Accuracy Prediction Model Fusing Time-Frequency Domain Features
Signal Processing
Predicts big waves to harness ocean power.
WaveTuner: Comprehensive Wavelet Subband Tuning for Time Series Forecasting
Machine Learning (CS)
Improves predictions by analyzing all time-series details.
WaveHiTS: Wavelet-Enhanced Hierarchical Time Series Modeling for Wind Direction Nowcasting in Eastern Inner Mongolia
Machine Learning (CS)
Predicts wind direction better for more wind power.