Prompting Large Language Models for Training-Free Non-Intrusive Load Monitoring
By: Junyu Xue, Xudong Wang, Xiaoling He, and more
Potential Business Impact:
Lets smart meters show which appliances use power.
Non-intrusive load monitoring (NILM) aims to disaggregate total electricity consumption into individual appliance usage, enabling more effective energy management. While deep learning has advanced NILM, it remains limited by its dependence on labeled data, restricted generalization, and lack of explainability. This paper introduces the first prompt-based NILM framework that leverages large language models (LLMs) with in-context learning. We design and evaluate prompt strategies that integrate appliance features, contextual information, and representative time-series examples through extensive case studies. Experiments on the REDD and UK-DALE datasets show that LLMs guided solely by prompts deliver only basic NILM capability, lagging behind traditional deep-learning models in complex scenarios. However, the experiments also demonstrate strong generalization across different houses and even regions, achieved simply by adapting the injected appliance features. The framework also provides clear, human-readable explanations for the inferred appliance states. Our findings delineate the capability boundaries of prompt-only LLMs for NILM; their strengths in generalization and explainability point to a promising new direction for the field.
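To make the in-context-learning setup concrete, the sketch below assembles a disaggregation prompt from the three ingredient types the abstract names: appliance features, contextual information, and labeled time-series examples. It is a minimal illustration, not the authors' actual prompt design; every field name, signature description, and example reading is an assumption for demonstration.

```python
# Hypothetical prompt-construction sketch for prompt-based NILM.
# All appliance signatures, context strings, and readings below are
# illustrative assumptions, not values from the paper.

def build_nilm_prompt(appliance_features, context, examples, window):
    """Assemble an in-context-learning prompt that asks an LLM to infer
    which appliances are active in a new aggregate-power window."""
    lines = ["You are an expert in non-intrusive load monitoring (NILM)."]
    lines.append("Known appliances and their typical power signatures:")
    for name, feat in appliance_features.items():
        lines.append(f"- {name}: {feat}")
    lines.append(f"Context: {context}")
    lines.append("Labeled examples (aggregate watts -> active appliances):")
    for series, labels in examples:
        lines.append(f"- {series} -> {labels}")
    lines.append(f"New aggregate reading: {window}")
    lines.append("Which appliances are ON? Explain your reasoning briefly.")
    return "\n".join(lines)

# Toy inputs: two appliance signatures, a context line, two labeled windows.
features = {
    "fridge":    "cycles every ~30 min, draws roughly 100-200 W",
    "microwave": "short bursts of roughly 1000-1500 W",
}
context = "REDD house 1, weekday morning"
examples = [
    ([120, 118, 121, 119], ["fridge"]),
    ([1310, 1295, 1322, 130], ["microwave", "fridge"]),
]

prompt = build_nilm_prompt(features, context, examples, [125, 1240, 1255, 122])
print(prompt)  # this string would be sent to an LLM for disaggregation
```

Generalizing to a different house or region, as the abstract describes, would then amount to swapping the `features` and `context` entries while leaving the prompt template unchanged.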
Similar Papers
Edge-Optimized Deep Learning & Pattern Recognition Techniques for Non-Intrusive Load Monitoring of Energy Time Series
Machine Learning (CS)
Shows how much electricity each appliance uses.
NILMFormer: Non-Intrusive Load Monitoring that Accounts for Non-Stationarity
Machine Learning (CS)
Shows which appliances use the most electricity.
Few Labels are all you need: A Weakly Supervised Framework for Appliance Localization in Smart-Meter Series
Machine Learning (CS)
Lets smart meters tell which appliances use power.