Nonintrusive Load Disaggregation Based on Attention Neural Networks
Nonintrusive load monitoring (NILM), also known as energy disaggregation, infers the energy consumption of individual appliances from household metered electricity data. NILM has recently garnered significant attention because it can help households reduce energy usage and improve their electricity-use behavior. In this paper, we propose a two-subnetwork model for NILM, consisting of a regression subnetwork and a seq2point-based classification subnetwork. In the regression subnetwork, stacked dilated convolutions extract multiscale features, and a self-attention mechanism is then applied to those features to obtain contextual representations. Compared with existing load disaggregation models, the proposed model has a larger receptive field and can capture crucial information within the data. The study uses the low-frequency UK-DALE dataset, released in 2015, which contains timestamps, the power of various appliances, and device state labels. House 1 and House 5 are used for training, while House 2 is reserved for testing. The proposed model achieves lower errors for all appliances than the compared algorithms: a 13.85% improvement in mean absolute error (MAE), a 21.27% improvement in signal aggregate error (SAE), and a 26.15% improvement in F1 score over existing algorithms. These results show that the proposed approach achieves superior disaggregation accuracy compared with existing methods.
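The regression subnetwork described in the abstract combines two standard building blocks: dilated convolutions, whose receptive field grows exponentially with depth, and self-attention, which lets every time step weigh features from the whole input window. The sketch below illustrates that combination in a seq2point setting (regressing the appliance power at the window midpoint). It is a minimal PyTorch illustration of the general technique, not the authors' implementation; the layer count, channel width, kernel size, and 599-sample window are assumed values.

```python
# Hypothetical sketch (not the paper's code): a seq2point-style regression
# subnetwork that stacks dilated 1-D convolutions and applies self-attention
# to the resulting multiscale features. All hyperparameters are illustrative.
import torch
import torch.nn as nn

class DilatedAttentionSeq2Point(nn.Module):
    def __init__(self, window: int = 599, channels: int = 64, heads: int = 4):
        super().__init__()
        # Stacked dilated convolutions: doubling the dilation at each layer
        # grows the receptive field exponentially with depth.
        layers, in_ch = [], 1
        for dilation in (1, 2, 4, 8):
            layers += [
                nn.Conv1d(in_ch, channels, kernel_size=3,
                          dilation=dilation, padding=dilation),  # length-preserving
                nn.ReLU(),
            ]
            in_ch = channels
        self.dilated = nn.Sequential(*layers)
        # Self-attention over the time axis turns the multiscale features
        # into contextual representations of the whole window.
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=heads,
                                          batch_first=True)
        # Seq2point head: regress the appliance power at the window midpoint.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(channels * window, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window) aggregate mains power
        h = self.dilated(x.unsqueeze(1))   # (batch, channels, window)
        h = h.transpose(1, 2)              # (batch, window, channels)
        h, _ = self.attn(h, h, h)          # contextual multiscale features
        return self.head(h)                # (batch, 1) midpoint appliance power

# Usage: a batch of 16 mains windows yields 16 midpoint power estimates.
model = DilatedAttentionSeq2Point()
out = model(torch.randn(16, 599))
assert out.shape == (16, 1)
```

For reference, the two error metrics named in the abstract are conventionally defined in the NILM literature as follows (the paper may differ in detail). With $y_t$ and $\hat{y}_t$ the true and predicted appliance power at time $t$, and $r = \sum_t y_t$, $\hat{r} = \sum_t \hat{y}_t$ the corresponding total energies,

$$\mathrm{MAE} = \frac{1}{T}\sum_{t=1}^{T}\left|\hat{y}_t - y_t\right|, \qquad \mathrm{SAE} = \frac{\left|\hat{r} - r\right|}{r}.$$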
Main Authors: | Shunfu Lin, Jiayu Yang, Yi Li, Yunwei Shen, Fangxing Li, Xiaoyan Bian, Dongdong Li
---|---
Format: | Article
Language: | English
Published: | Wiley, 2025-01-01
Series: | International Transactions on Electrical Energy Systems
ISSN: | 2050-7038
Collection: | DOAJ
Online Access: | http://dx.doi.org/10.1155/etep/3405849