FTT-GRU: A Hybrid Fast Temporal Transformer with GRU for Remaining Useful Life Prediction

Published: November 1, 2025 | arXiv ID: 2511.00564v1

By: Varun Teja Chirukiri, Udaya Bhasker Cheerala, Sandeep Kanta, and more

Potential Business Impact:

Accurately predicts when machines will break down.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Accurate prediction of the remaining useful life (RUL) of industrial machinery is essential for reducing downtime and optimizing maintenance schedules. Existing approaches, such as long short-term memory (LSTM) networks and convolutional neural networks (CNNs), often struggle to model both global temporal dependencies and fine-grained degradation trends in multivariate sensor data. We propose a hybrid model, FTT-GRU, which combines a Fast Temporal Transformer (FTT), a lightweight Transformer variant that linearizes attention via the fast Fourier transform (FFT), with a gated recurrent unit (GRU) layer for sequential modeling. To the best of our knowledge, this is the first application of an FTT with a GRU for RUL prediction on NASA CMAPSS, enabling simultaneous capture of global and local degradation patterns in a compact architecture. On CMAPSS FD001, FTT-GRU attains RMSE 30.76, MAE 18.97, and R^2 = 0.45, with 1.12 ms CPU latency at batch size 1. Relative to the best published deep baseline (TCN-Attention), it improves RMSE by 1.16% and MAE by 4.00%. Training curves averaged over k = 3 runs show smooth convergence with narrow 95% confidence bands, and ablations (GRU-only, FTT-only) support the contribution of both components. These results demonstrate that a compact Transformer-RNN hybrid delivers accurate and efficient RUL predictions on CMAPSS, making it suitable for real-time industrial prognostics.
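The abstract's description suggests a pipeline of FFT-based global token mixing followed by a GRU and a regression head. Below is a minimal PyTorch sketch of that idea, assuming an FNet-style FFT mixing block as a stand-in for the paper's FFT-linearized attention; layer sizes, window length, and the head design are illustrative assumptions, not the authors' published configuration.

```python
# Hedged sketch of an FTT-GRU-style model for RUL regression.
# Assumptions: FFT token mixing approximates the paper's "linearized
# attention via FFT"; all hyperparameters below are placeholders.
import torch
import torch.nn as nn


class FFTMixerBlock(nn.Module):
    """Transformer-style block with attention replaced by a 2-D FFT mix."""

    def __init__(self, d_model: int, d_ff: int = 128, dropout: float = 0.1):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(),
            nn.Dropout(dropout), nn.Linear(d_ff, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Global mixing over (time, feature); keep the real part only.
        mixed = torch.fft.fft2(x.float()).real.to(x.dtype)
        x = self.norm1(x + mixed)
        return self.norm2(x + self.ff(x))


class FTTGRU(nn.Module):
    """FFT-mixer blocks for global context, GRU for local degradation trends."""

    def __init__(self, n_sensors: int = 14, d_model: int = 64,
                 n_blocks: int = 2, gru_hidden: int = 32):
        super().__init__()
        self.embed = nn.Linear(n_sensors, d_model)
        self.blocks = nn.Sequential(*[FFTMixerBlock(d_model) for _ in range(n_blocks)])
        self.gru = nn.GRU(d_model, gru_hidden, batch_first=True)
        self.head = nn.Linear(gru_hidden, 1)  # scalar RUL per window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_length, n_sensors) of normalized sensor readings
        h = self.blocks(self.embed(x))
        _, last = self.gru(h)              # final hidden state summarizes the window
        return self.head(last[-1]).squeeze(-1)


if __name__ == "__main__":
    model = FTTGRU()
    windows = torch.randn(8, 30, 14)       # e.g. 30-cycle windows, 14 CMAPSS sensors
    print(model(windows).shape)            # torch.Size([8]) predicted RUL values
```

In this arrangement the FFT blocks capture global temporal dependencies across the whole window, while the GRU consumes the mixed sequence to model the local degradation trend, matching the division of labor the abstract describes.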

Country of Origin
🇰🇷 Korea, Republic of

Page Count
5 pages

Category
Computer Science:
Machine Learning (CS)