FRWKV: Frequency-Domain Linear Attention for Long-Term Time Series Forecasting

Published: December 8, 2025 | arXiv ID: 2512.07539v1

By: Qingyuan Yang, Shizhuo, Dongyue Chen, and more

Potential Business Impact:

Forecasts long-horizon time series more accurately and with lower computational cost.

Business Areas:
Facial Recognition Data and Analytics, Software

Traditional Transformers face a major bottleneck in long-sequence time series forecasting due to their quadratic complexity ($\mathcal{O}(T^2)$) and their limited ability to effectively exploit frequency-domain information. Inspired by RWKV's $\mathcal{O}(T)$ linear attention and by frequency-domain modeling, we propose FRWKV, a frequency-domain linear-attention framework that overcomes these limitations. Our model integrates linear attention mechanisms with frequency-domain analysis, achieving $\mathcal{O}(T)$ computational complexity in the attention path while exploiting spectral information to enhance temporal feature representations for scalable long-sequence modeling. Across eight real-world datasets, FRWKV achieves the best average rank. Our ablation studies confirm the critical roles of both the linear-attention and frequency-encoder components. This work demonstrates the powerful synergy between linear attention and frequency analysis, establishing a new paradigm for scalable time series modeling. Code is available at https://github.com/yangqingyuan-byte/FRWKV.
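
The authors' implementation is in the repository linked above; the sketch below is only a minimal, hypothetical illustration of the two ideas the abstract names, assuming a PyTorch setting: a frequency-domain encoder that filters the FFT spectrum of the input, followed by a linear-attention mixer that avoids the $T \times T$ score matrix by accumulating key-value statistics in $\mathcal{O}(T)$. All class and parameter names (`FrequencyEncoder`, `LinearAttention`, `FRWKVBlockSketch`) are illustrative, not taken from the paper.

```python
# Minimal sketch (not the authors' code) of a frequency-encoder + linear-attention
# block in the spirit of FRWKV. Shapes and module names are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FrequencyEncoder(nn.Module):
    """Filters each feature channel in the frequency domain with learnable complex weights."""

    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        n_freq = seq_len // 2 + 1  # number of rFFT bins
        # Learnable per-bin, per-feature complex filter (stored as real/imag parts).
        self.filter_real = nn.Parameter(torch.ones(n_freq, d_model))
        self.filter_imag = nn.Parameter(torch.zeros(n_freq, d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        spec = torch.fft.rfft(x, dim=1)                      # complex spectrum
        filt = torch.complex(self.filter_real, self.filter_imag)
        spec = spec * filt                                   # element-wise spectral filtering
        return torch.fft.irfft(spec, n=x.size(1), dim=1)     # back to the time domain


class LinearAttention(nn.Module):
    """O(T) causal attention via a positive feature map and running key-value sums."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q = F.elu(self.q_proj(x)) + 1.0                      # positive feature map
        k = F.elu(self.k_proj(x)) + 1.0
        v = self.v_proj(x)
        # Running sums over keys replace the T x T attention matrix.
        kv = torch.cumsum(k.unsqueeze(-1) * v.unsqueeze(-2), dim=1)  # (B, T, D, D)
        z = torch.cumsum(k, dim=1)                                    # (B, T, D)
        num = torch.einsum("btd,btde->bte", q, kv)
        den = (q * z).sum(dim=-1, keepdim=True).clamp(min=1e-6)
        return self.out(num / den)


class FRWKVBlockSketch(nn.Module):
    """Frequency encoding followed by linear attention, each with a residual connection."""

    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        self.freq = FrequencyEncoder(seq_len, d_model)
        self.attn = LinearAttention(d_model)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.freq(self.norm1(x))
        x = x + self.attn(self.norm2(x))
        return x


if __name__ == "__main__":
    x = torch.randn(2, 96, 32)                 # (batch, lookback length, features)
    block = FRWKVBlockSketch(seq_len=96, d_model=32)
    print(block(x).shape)                      # torch.Size([2, 96, 32])
```

The point of the sketch is the cost profile: the spectral filter is a per-bin multiplication on top of an FFT, and the attention path touches each time step once through cumulative sums, which is what an $\mathcal{O}(T)$ attention path looks like in practice, as opposed to a softmax over all $T^2$ query-key pairs.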

Country of Origin
🇨🇳 China

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)