Partially Observable Markov Decision Process Framework for Operating Condition Optimization Using Real-Time Degradation Signals
By: Boyang Xu, Yunyi Kang, Xinyu Zhao, and more
Potential Business Impact:
Helps predict machine problems before they cause breakdowns.
In many engineering systems, predictive maintenance and operational control are essential for increasing efficiency and reliability while reducing maintenance costs. A major challenge, however, is that these systems are monitored by many sensors, and analyzing their signals simultaneously to optimize predictive maintenance is difficult. In this paper, we propose a systematic decision-making framework that improves system performance in manufacturing practice by incorporating the real-time degradation signals generated by multiple sensors. Specifically, we propose a partially observable Markov decision process (POMDP) model that generates optimal capacity and predictive maintenance policies when observations of the system state are imperfect. This work provides a systematic approach to jointly controlling operating conditions and preventive maintenance using real-time machine deterioration signals, incorporating both the degradation constraint and the non-observable states. We apply the proposed method to bearing degradation data and the NASA aircraft turbofan engine dataset, demonstrating its effectiveness.
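To make the POMDP setting concrete, the sketch below shows a belief update over hidden degradation states and a joint choice of operating load and maintenance. This is not the authors' model: the state and action spaces, the transition and observation matrices, the reward values, and helper names such as belief_update and greedy_action are hypothetical placeholders, and the myopic one-step rule stands in for whatever policy the paper's POMDP solution would produce.

```python
"""Illustrative POMDP sketch: joint operating-condition and maintenance
decisions from noisy degradation signals. All quantities are assumed."""
import numpy as np

# Hidden degradation states (assumed): 0 = healthy, 1 = degraded, 2 = failed.
N_STATES = 3

# Actions (assumed discretization): (operating load, perform maintenance?).
ACTIONS = [("low_load", False), ("high_load", False), ("low_load", True)]

# Assumed transition matrices P[a][s, s']: higher load degrades faster,
# maintenance pushes the machine back toward the healthy state.
P = {
    0: np.array([[0.90, 0.09, 0.01],
                 [0.00, 0.85, 0.15],
                 [0.00, 0.00, 1.00]]),
    1: np.array([[0.75, 0.20, 0.05],
                 [0.00, 0.65, 0.35],
                 [0.00, 0.00, 1.00]]),
    2: np.array([[1.00, 0.00, 0.00],
                 [0.90, 0.10, 0.00],
                 [0.80, 0.20, 0.00]]),
}

# Assumed observation model O[s, z]: probability of a discretized sensor
# reading z (low / medium / high vibration) given hidden state s.
O = np.array([[0.80, 0.15, 0.05],
              [0.15, 0.70, 0.15],
              [0.05, 0.15, 0.80]])

# Assumed economics: production payoff per action, failure and maintenance costs.
PRODUCTION = np.array([1.0, 2.0, 0.0])
FAILURE_COST, MAINT_COST = 20.0, 3.0


def belief_update(belief, action, obs):
    """Bayes update of the belief over hidden states after (action, observation)."""
    predicted = belief @ P[action]      # predict step through the transition model
    unnorm = predicted * O[:, obs]      # correct with the observation likelihood
    return unnorm / unnorm.sum()


def expected_reward(belief, action):
    """One-step expected reward under the current belief (myopic criterion)."""
    next_belief = belief @ P[action]
    reward = PRODUCTION[action] - FAILURE_COST * next_belief[2]
    if ACTIONS[action][1]:              # maintenance action incurs its own cost
        reward -= MAINT_COST
    return reward


def greedy_action(belief):
    """Pick the action with the best one-step expected reward (illustration only)."""
    return int(np.argmax([expected_reward(belief, a) for a in range(len(ACTIONS))]))


if __name__ == "__main__":
    belief = np.array([1.0, 0.0, 0.0])  # start believing the machine is healthy
    for obs in [0, 1, 1, 2]:            # a made-up stream of sensor readings
        a = greedy_action(belief)
        belief = belief_update(belief, a, obs)
        print(f"obs={obs}  action={ACTIONS[a]}  belief={np.round(belief, 3)}")
```

In this toy setup the belief drifts toward the degraded and failed states as high-vibration readings arrive, at which point the maintenance action becomes the greedy choice; a full POMDP solution would instead optimize over the entire planning horizon.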
Similar Papers
Optimizing Predictive Maintenance in Intelligent Manufacturing: An Integrated FNO-DAE-GNN-PPO MDP Framework
Machine Learning (CS)
Fixes machines before they break, saving money.
A Sensor-Driven Optimization Framework for Asset Management in Energy Systems: Implications for Full and Partial Digital Transformation in Hydro Fleets
Optimization and Control
Predicts when machines will break to save money.
From CAD to POMDP: Probabilistic Planning for Robotic Disassembly of End-of-Life Products
Robotics
Robots learn to take apart end-of-life products better.