TimePerceiver: An Encoder-Decoder Framework for Generalized Time-Series Forecasting
By: Jaebin Lee, Hankook Lee
In machine learning, effective modeling requires a holistic consideration of how to encode inputs, make predictions (i.e., decoding), and train the model. However, in time-series forecasting, prior work has predominantly focused on encoder design, often treating prediction and training as separate or secondary concerns. In this paper, we propose TimePerceiver, a unified encoder-decoder forecasting framework that is tightly aligned with an effective training strategy. To be specific, we first generalize the forecasting task to include diverse temporal prediction objectives such as extrapolation, interpolation, and imputation. Since this generalization requires handling input and target segments that are arbitrarily positioned along the temporal axis, we design a novel encoder-decoder architecture that can flexibly perceive and adapt to these varying positions. For encoding, we introduce a set of latent bottleneck representations that can interact with all input segments to jointly capture temporal and cross-channel dependencies. For decoding, we leverage learnable queries corresponding to target timestamps to effectively retrieve relevant information. Extensive experiments demonstrate that our framework consistently and significantly outperforms prior state-of-the-art baselines across a wide range of benchmark datasets. The code is available at https://github.com/efficient-learning-lab/TimePerceiver.
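To make the encoder-decoder design concrete, below is a minimal PyTorch sketch of the Perceiver-style architecture the abstract describes: a set of learnable latent bottleneck vectors that cross-attend to all input segments, and learnable queries derived from target timestamps that retrieve predictions from the latents. All names, dimensions, and layer choices here are illustrative assumptions, not the paper's actual implementation; see the linked repository for the real code.

```python
# Illustrative sketch only -- module names, sizes, and details are assumptions,
# not the authors' implementation (see the GitHub repo for the real code).
import torch
import torch.nn as nn

class TimePerceiverSketch(nn.Module):
    def __init__(self, d_model=128, n_latents=64, n_heads=4):
        super().__init__()
        # Latent bottleneck: learnable vectors that cross-attend to all
        # input segments, jointly capturing temporal and cross-channel
        # dependencies (encoding).
        self.latents = nn.Parameter(torch.randn(n_latents, d_model))
        self.encode_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Decoder queries are built from target timestamps, so targets can
        # sit anywhere on the time axis (extrapolation, interpolation,
        # or imputation).
        self.query_embed = nn.Linear(1, d_model)
        self.decode_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, input_tokens, target_times):
        # input_tokens: (B, N_in, d_model) embedded input segments
        # target_times: (B, N_out, 1) arbitrary timestamps to predict
        B = input_tokens.size(0)
        z = self.latents.unsqueeze(0).expand(B, -1, -1)
        z, _ = self.encode_attn(z, input_tokens, input_tokens)  # latents read inputs
        z, _ = self.self_attn(z, z, z)                          # latents mix
        q = self.query_embed(target_times)                      # one query per target
        out, _ = self.decode_attn(q, z, z)                      # queries read latents
        return self.head(out)                                   # (B, N_out, 1)

model = TimePerceiverSketch()
x = torch.randn(2, 96, 128)  # two series, 96 embedded input tokens each
t = torch.rand(2, 24, 1)     # 24 target timestamps at arbitrary positions
y = model(x, t)              # -> (2, 24, 1)
```

The key design point the sketch captures is the bottleneck: the cost of attention scales with the fixed number of latents rather than the input length, and because decoding is driven by timestamp-conditioned queries, the same model can answer forecasting, interpolation, and imputation requests without architectural changes.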