Leveraging Duration Pseudo-Embeddings in Multilevel LSTM and GCN Hypermodels for Outcome-Oriented PPM
By: Fang Wang, Paolo Ceravolo, Ernesto Damiani
Potential Business Impact:
Helps predict the outcomes of business processes more accurately.
Existing deep learning models for Predictive Process Monitoring (PPM) struggle with temporal irregularities, particularly stochastic event durations and overlapping timestamps, which limits their adaptability across heterogeneous datasets. We propose a dual-input neural network strategy that separates event and sequence attributes, using a duration-aware pseudo-embedding matrix to transform temporal importance into compact, learnable representations. This design is implemented across two baseline families, B-LSTM and B-GCN, and their duration-aware variants, D-LSTM and D-GCN. All models incorporate self-tuned hypermodels for adaptive architecture selection. Experiments on balanced and imbalanced outcome prediction tasks show that duration pseudo-embedding inputs consistently improve generalization, reduce model complexity, and enhance interpretability. Our results demonstrate the benefits of explicit temporal encoding and provide a flexible design for robust, real-world PPM applications.
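To make the dual-input idea concrete, the sketch below shows one plausible way to combine an event-attribute embedding with a learned duration pseudo-embedding in an LSTM prefix classifier. It is a minimal PyTorch illustration under assumptions: the class names (DurationPseudoEmbedding, DualInputLSTM), the log-scale duration bucketing, and all sizes are hypothetical choices, not the authors' implementation, which also includes GCN variants and self-tuned hypermodel search.

```python
import torch
import torch.nn as nn

class DurationPseudoEmbedding(nn.Module):
    """Maps per-event durations to bucket indices and looks up learnable vectors.

    Hypothetical sketch: log-scale bucketing and the embedding size are
    illustrative assumptions, not the paper's actual parameterization.
    """
    def __init__(self, num_buckets=32, dim=16, max_log_duration=12.0):
        super().__init__()
        self.num_buckets = num_buckets
        self.max_log_duration = max_log_duration
        self.embedding = nn.Embedding(num_buckets, dim)

    def forward(self, durations):
        # durations: (batch, seq_len) in seconds; log-scale, then bucketize
        log_d = torch.log1p(durations.clamp(min=0.0))
        idx = (log_d / self.max_log_duration * (self.num_buckets - 1)).long()
        idx = idx.clamp(0, self.num_buckets - 1)
        return self.embedding(idx)  # (batch, seq_len, dim)

class DualInputLSTM(nn.Module):
    """Dual-input model: event-attribute branch plus duration pseudo-embedding branch."""
    def __init__(self, num_activities, act_dim=32, dur_dim=16, hidden=64, num_outcomes=2):
        super().__init__()
        self.act_emb = nn.Embedding(num_activities, act_dim)
        self.dur_emb = DurationPseudoEmbedding(dim=dur_dim)
        self.lstm = nn.LSTM(act_dim + dur_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_outcomes)

    def forward(self, activities, durations):
        # Concatenate the two input branches per event, then classify the prefix
        x = torch.cat([self.act_emb(activities), self.dur_emb(durations)], dim=-1)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1])  # outcome logits

# Usage sketch: 8 traces, prefix length 20, 25 activity types
model = DualInputLSTM(num_activities=25)
acts = torch.randint(0, 25, (8, 20))
durs = torch.rand(8, 20) * 3600.0  # durations in seconds
logits = model(acts, durs)         # (8, 2)
```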
Similar Papers
Comprehensive Attribute Encoding and Dynamic LSTM HyperModels for Outcome Oriented Predictive Business Process Monitoring
Machine Learning (CS)
Predicts business problems before they happen.
Time-Aware and Transition-Semantic Graph Neural Networks for Interpretable Predictive Business Process Monitoring
Machine Learning (CS)
Predicts future business steps by learning from past actions.
On the Simplification of Neural Network Architectures for Predictive Process Monitoring
Machine Learning (CS)
Makes smart computer predictions faster with smaller models.