LoFT-LLM: Low-Frequency Time-Series Forecasting with Large Language Models
By: Jiacheng You, Jingcheng Yang, Yuhang Xie, and more
Time-series forecasting in real-world applications such as finance and energy often faces challenges due to limited training data and complex, noisy temporal dynamics. Existing deep forecasting models typically supervise predictions using full-length temporal windows, which include substantial high-frequency noise and obscure long-term trends. Moreover, auxiliary variables containing rich domain-specific information are often underutilized, especially in few-shot settings. To address these challenges, we propose LoFT-LLM, a frequency-aware forecasting pipeline that integrates low-frequency learning with semantic calibration via a large language model (LLM). First, a Patch Low-Frequency forecasting Module (PLFM) extracts stable low-frequency trends from localized spectral patches. Second, a residual learner models the remaining high-frequency variations. Finally, a fine-tuned LLM refines the predictions by incorporating auxiliary context and domain knowledge through structured natural language prompts. Extensive experiments on financial and energy datasets demonstrate that LoFT-LLM significantly outperforms strong baselines under both full-data and few-shot regimes, delivering superior accuracy, robustness, and interpretability.
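To make the patch-based low-frequency idea concrete, here is a minimal sketch: split a series into patches, keep only each patch's lowest-frequency FFT coefficients to form a smooth trend, and treat the remainder as the residual signal. This is an illustrative assumption of how such a decomposition can work, not the paper's actual PLFM; the function name and the patch_len and keep_frac parameters are hypothetical.

```python
import numpy as np

def low_frequency_trend(x, patch_len=32, keep_frac=0.1):
    """Reconstruct a smooth trend by keeping only the lowest-frequency
    FFT coefficients within each localized patch of the series."""
    n = len(x)
    trend = np.empty(n, dtype=float)
    for start in range(0, n, patch_len):
        patch = x[start:start + patch_len]
        spec = np.fft.rfft(patch)
        # Number of low-frequency bins to retain in this patch
        k = max(1, int(np.ceil(keep_frac * len(spec))))
        spec[k:] = 0.0  # zero out the high-frequency content
        trend[start:start + patch_len] = np.fft.irfft(spec, n=len(patch))
    return trend

# Toy usage: a noisy sine wave
t = np.linspace(0, 8 * np.pi, 256)
x = np.sin(t) + 0.3 * np.random.randn(256)
trend = low_frequency_trend(x)   # a trend model could be supervised on this
residual = x - trend             # a residual learner would model this part
```

In the paper's pipeline, the trend and residual predictions would then be combined and passed, along with auxiliary context, to the LLM refinement stage.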
Similar Papers
Semantic-Enhanced Time-Series Forecasting via Large Language Models
Machine Learning (CS)
Helps computers predict future events better.
LLM4FTS: Enhancing Large Language Models for Financial Time Series Prediction
Machine Learning (CS)
Helps computers predict stock prices better.