Bridging Short- and Long-Term Dependencies: A CNN-Transformer Hybrid for Financial Time Series Forecasting
By: Tiantian Tu
Potential Business Impact:
Predicts stock prices better by seeing short and long trends.
Time series forecasting is crucial for decision-making across various domains, particularly in financial markets where stock prices exhibit complex and non-linear behaviors. Accurately predicting future price movements is challenging due to the difficulty of capturing both short-term fluctuations and long-term dependencies in the data. Convolutional Neural Networks (CNNs) are well-suited for modeling localized, short-term patterns but struggle with long-range dependencies due to their limited receptive field. In contrast, Transformers are highly effective at capturing global temporal relationships and modeling long-term trends. In this paper, we propose a hybrid architecture that combines CNNs and Transformers to effectively model both short- and long-term dependencies in financial time series data. We apply this approach to forecast stock price movements for S&P 500 constituents and demonstrate that our model outperforms traditional statistical models and popular deep learning methods in intraday stock price forecasting, providing a robust framework for financial prediction.
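The paper itself does not publish its implementation details here, but the two-stage idea the abstract describes (a convolutional stage for local, short-term patterns feeding an attention stage for global, long-range dependencies) can be illustrated with a minimal NumPy sketch. Everything below (kernel size, single-head attention, the `hybrid_forecast` readout) is an illustrative assumption, not the authors' architecture.

```python
import numpy as np

def causal_conv1d(x, kernel):
    """Causal 1-D convolution over time: step t sees only steps <= t.
    x: (T, d) sequence; kernel: (k,) shared across the d feature channels."""
    k = len(kernel)
    padded = np.concatenate([np.zeros((k - 1, x.shape[1])), x], axis=0)
    # each output row is a weighted sum of the k most recent input rows
    return np.stack([kernel @ padded[t:t + k] for t in range(x.shape[0])])

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, seed=0):
    """Single-head scaled dot-product self-attention over the whole sequence,
    giving every time step access to every other step (global receptive field)."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    weights = softmax(Q @ K.T / np.sqrt(d))  # (T, T) attention matrix
    return weights @ V

def hybrid_forecast(x, kernel):
    """CNN stage extracts short-term features; attention stage mixes them globally."""
    local = np.tanh(causal_conv1d(x, kernel))  # short-term patterns
    mixed = self_attention(local)              # long-range dependencies
    return mixed.mean(axis=1)                  # toy readout: one value per step

# Toy usage: 32 time steps of 8 features (e.g. engineered price inputs).
rng = np.random.default_rng(42)
prices = rng.standard_normal((32, 8))
preds = hybrid_forecast(prices, kernel=np.array([0.5, 0.3, 0.2]))
print(preds.shape)  # (32,)
```

The ordering (convolution first, attention second) mirrors the paper's motivation: the CNN's limited receptive field summarizes local fluctuations, and attention then relates those summaries across the full horizon.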
Similar Papers
CNN-TFT explained by SHAP with multi-head attention weights for time series forecasting
Machine Learning (CS)
Predicts future events more accurately using past data.
Transformer Encoder and Multi-features Time2Vec for Financial Prediction
Machine Learning (CS)
Predicts stock prices better by seeing how companies move together.
Generative Modeling of Networked Time-Series via Transformer Architectures
Machine Learning (CS)
Creates more data to make computer programs smarter.