UrbanAI 2025 Challenge: Linear vs Transformer Models for Long-Horizon Exogenous Temperature Forecasting
By: Ruslan Gokhman
Potential Business Impact:
Simple models predict room temperature better than complex ones.
We study long-horizon exogenous-only temperature forecasting, a challenging univariate setting in which only past values of the indoor temperature are used for prediction, using linear and Transformer-family models. We evaluate Linear, NLinear, DLinear, Transformer, Informer, and Autoformer under standardized train, validation, and test splits. Results show that the linear baselines (Linear, NLinear, DLinear) consistently outperform the more complex Transformer-family architectures, with DLinear achieving the best overall accuracy across all splits. These findings highlight that carefully designed linear models remain strong baselines for time series forecasting in challenging exogenous-only settings.
Similar Papers
UNet with Axial Transformer: A Neural Weather Model for Precipitation Nowcasting
Machine Learning (CS)
Predicts rain and storms much faster and better.
Why Do Transformers Fail to Forecast Time Series In-Context?
Machine Learning (CS)
Makes computers predict future events more accurately.
Rethinking deep learning: linear regression remains a key benchmark in predicting terrestrial water storage
Machine Learning (CS)
Simple math predicts water better than complex AI.