Data-Driven Policy Mapping for Safe RL-based Energy Management Systems
By: Theo Zangato, Aomar Osmani, Pegah Alizadeh
Potential Business Impact:
Saves money by making buildings use less energy.
Increasing global energy demand and renewable integration complexity have placed buildings at the center of sustainable energy management. We present a three-step reinforcement learning (RL)-based Building Energy Management System (BEMS) that combines clustering, forecasting, and constrained policy learning to address scalability, adaptability, and safety challenges. First, we cluster non-shiftable load profiles to identify common consumption patterns, enabling policy generalization and transfer without retraining for each new building. Next, we integrate an LSTM-based forecasting module to anticipate future states, improving the RL agents' responsiveness to dynamic conditions. Lastly, domain-informed action masking ensures safe exploration and operation, preventing harmful decisions. Evaluated on real-world data, our approach reduces operating costs by up to 15% for certain building types, maintains stable environmental performance, and quickly classifies and optimizes new buildings with limited data. It also adapts to stochastic tariff changes without retraining. Overall, this framework delivers scalable, robust, and cost-effective building energy management.
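The abstract's safety mechanism, domain-informed action masking, can be illustrated with a minimal sketch. The action names, state-of-charge thresholds, and helper functions below are illustrative assumptions for a battery-control agent, not the paper's actual implementation: unsafe actions are masked to `False`, and the agent picks the best-valued action among those that remain.

```python
import numpy as np

# Hypothetical discrete action set for a BEMS battery controller.
ACTIONS = ["charge", "discharge", "idle"]

def safe_action_mask(battery_soc, soc_min=0.1, soc_max=0.9):
    """Return a boolean mask over ACTIONS; False marks unsafe actions.

    Domain knowledge encoded here (assumed, for illustration):
    never charge a nearly full battery, never discharge a nearly empty one.
    """
    mask = np.ones(len(ACTIONS), dtype=bool)
    if battery_soc >= soc_max:  # nearly full: charging risks overcharge
        mask[ACTIONS.index("charge")] = False
    if battery_soc <= soc_min:  # nearly empty: discharging risks deep discharge
        mask[ACTIONS.index("discharge")] = False
    return mask

def masked_greedy_action(q_values, mask):
    """Select the highest-value action among those the mask allows."""
    q = np.where(mask, q_values, -np.inf)  # masked actions can never win
    return int(np.argmax(q))

# Battery nearly full: "charge" is masked even though it has the top Q-value,
# so the agent falls back to the best safe action.
mask = safe_action_mask(battery_soc=0.95)
action = masked_greedy_action(np.array([2.0, 1.0, 0.5]), mask)
print(ACTIONS[action])  # prints "discharge"
```

During training the same mask is applied before sampling exploratory actions, so the agent never even tries an unsafe action, which is what makes exploration safe rather than merely penalized.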
Similar Papers
Deep reinforcement learning-based joint real-time energy scheduling for green buildings with heterogeneous battery energy storage devices
Systems and Control
Saves money by smartly charging electric cars.
STEMS: Spatial-Temporal Enhanced Safe Multi-Agent Coordination for Building Energy Management
Artificial Intelligence
Saves energy and money in buildings safely.
Integration of Multi-Mode Preference into Home Energy Management System Using Deep Reinforcement Learning
Machine Learning (CS)
Smart homes learn your comfort to save energy.