Enhancing Time Series Forecasting with Fuzzy Attention-Integrated Transformers
By: Sanjay Chakraborty, Fredrik Heintz
Potential Business Impact:
Helps computers predict future events better.
This paper introduces FANTF (Fuzzy Attention Network-Based Transformers), a novel approach that integrates fuzzy logic with existing transformer architectures to advance time series forecasting, classification, and anomaly detection tasks. FANTF leverages a proposed fuzzy attention mechanism incorporating fuzzy membership functions to handle uncertainty and imprecision in noisy and ambiguous time series data. By embedding fuzzy logic principles into the self-attention module of existing transformer architectures, FANTF enhances the ability to capture complex temporal dependencies and multivariate relationships. The framework combines fuzzy-enhanced attention with a set of existing benchmark transformer architectures to provide efficient forecasting, classification, and anomaly detection. Specifically, FANTF generates learnable fuzzy attention scores that highlight the relative importance of temporal features and data points, offering insights into its decision-making process. Experimental evaluations on real-world datasets reveal that FANTF significantly outperforms traditional transformer-based models on forecasting, classification, and anomaly detection tasks.
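The abstract does not give FANTF's exact formulation, but the idea of modulating self-attention with a fuzzy membership function can be illustrated. The sketch below is a minimal, hypothetical variant: it assumes a Gaussian membership function over the attention logits, with a learnable center and width per head (the function names, parameterization, and the multiplicative combination with the logits are all assumptions, not the paper's method).

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuzzy_attention(Q, K, V, center=0.0, width=1.0):
    """Scaled dot-product attention with a Gaussian fuzzy membership
    weighting on the logits (illustrative sketch, not the paper's exact model).

    Q, K, V: arrays of shape (T, d) for a sequence of T time steps.
    center, width: parameters of the Gaussian membership function;
    in a trained model these would be learnable.
    """
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d)                    # (T, T) similarity scores
    # Gaussian membership: degree to which each score belongs to the fuzzy set
    mu = np.exp(-((logits - center) ** 2) / (2.0 * width ** 2))
    # fuzziness-modulated attention weights, normalized per query
    weights = softmax(logits * mu, axis=-1)
    return weights @ V, weights

# toy usage on a short multivariate sequence
rng = np.random.default_rng(0)
T, d = 6, 4
Q = rng.standard_normal((T, d))
K = rng.standard_normal((T, d))
V = rng.standard_normal((T, d))
out, w = fuzzy_attention(Q, K, V)
```

In this sketch the membership values `mu` dampen logits far from the fuzzy set's center, which is one simple way to down-weight noisy or outlying similarities before normalization; the paper's learnable fuzziness scores presumably serve a similar role with a trained parameterization.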
Similar Papers
A Neuro-Fuzzy System for Interpretable Long-Term Stock Market Forecasting
Artificial Intelligence
Helps predict stock prices by understanding patterns.
Quantum Temporal Fusion Transformer
Machine Learning (CS)
Quantum computer predicts future better than old ones.
SFANet: Spatial-Frequency Attention Network for Deepfake Detection
CV and Pattern Recognition
Finds fake videos better than before.