Digitala Vetenskapliga Arkivet

Improved electric load forecasting using quantile long short-term memory network with dual attention mechanism
Mirpur Univ Sci & Technol (MUST), Dept Elect Engn, Mirpur 10250, Azad Jammu & Kashmir, Pakistan.
Univ York, Dept Comp Sci, Heslington YO10 5GH, England.
2025 (English). In: Energy Reports, E-ISSN 2352-4847, Vol. 13, p. 2343-2353. Article in journal (Refereed). Published.
Abstract [en]

Robust and accurate load forecasting is necessary to ensure effective power market operations and to optimize load dispatch strategies. Deep learning models have recently gained popularity because of their strong ability to learn data patterns, yet conventional deep learning models still struggle to predict complex load patterns precisely. To address this, a novel quantile long short-term memory network with dual attention is proposed for hour-ahead short-term load forecasting. By combining dual attention mechanisms with a quantile regression-based long short-term memory network, the proposed framework effectively captures the temporal dependencies of complex load patterns. The method is thoroughly tested against baseline techniques, including the gated recurrent unit and hybridized recurrent neural network methodologies, on datasets from Panama City and the Islamabad Electric Supply Company (IESCO). The proposed network achieves notable performance gains, reducing mean absolute percentage error by 2.35% and 5.36% relative to the best-performing baseline models on the Panama and IESCO datasets, respectively. These results demonstrate the method's effectiveness in providing more accurate forecasts for enhanced grid stability and economic dispatch efficiency.
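
The abstract does not include implementation details, so the following PyTorch sketch is an illustration rather than the authors' method: it shows how an LSTM forecaster with a single additive attention layer over its hidden states can be trained with the quantile (pinball) loss. The paper describes a dual attention mechanism; only a temporal attention half is sketched here, and all names (pinball_loss, QuantileAttentionLSTM), layer sizes, and quantile choices are illustrative assumptions.

# Minimal sketch (not the authors' code): quantile-loss LSTM with one
# temporal attention layer. Assumed hyperparameters throughout.
import torch
import torch.nn as nn

def pinball_loss(pred, target, quantiles):
    # pred: (batch, n_quantiles); target: (batch, 1).
    # For each quantile q, under-prediction is penalised by q and
    # over-prediction by (1 - q); losses are averaged over quantiles.
    losses = []
    for i, q in enumerate(quantiles):
        err = target.squeeze(-1) - pred[:, i]
        losses.append(torch.max(q * err, (q - 1.0) * err).mean())
    return torch.stack(losses).mean()

class QuantileAttentionLSTM(nn.Module):
    def __init__(self, n_features, hidden=64, quantiles=(0.1, 0.5, 0.9)):
        super().__init__()
        self.quantiles = quantiles
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)              # attention score per time step
        self.head = nn.Linear(hidden, len(quantiles))  # one output per quantile

    def forward(self, x):
        # x: (batch, seq_len, n_features) of lagged load and exogenous inputs
        h, _ = self.lstm(x)                        # (batch, seq_len, hidden)
        w = torch.softmax(self.score(h), dim=1)    # temporal attention weights
        context = (w * h).sum(dim=1)               # weighted sum of hidden states
        return self.head(context)                  # (batch, n_quantiles)

# Toy usage: a 24-hour input window with 5 features, hour-ahead target.
model = QuantileAttentionLSTM(n_features=5)
x = torch.randn(32, 24, 5)
y = torch.randn(32, 1)
loss = pinball_loss(model(x), y, model.quantiles)
loss.backward()

Unlike training with mean squared error, the pinball loss makes the network emit one output per quantile, so the 0.1 and 0.9 quantile outputs bound a prediction interval around the median forecast.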

Place, publisher, year, edition, pages
Elsevier, 2025. Vol. 13, p. 2343-2353
Keywords [en]
Dual attention, Quantile loss function, Load forecasting, Hybrid methodologies, Long short-term memory network
National Category
Energy Engineering
Identifiers
URN: urn:nbn:se:uu:diva-551747
DOI: 10.1016/j.egyr.2025.01.058
ISI: 001425246700001
Scopus ID: 2-s2.0-85216924147
OAI: oai:DiVA.org:uu-551747
DiVA, id: diva2:1947577
Available from: 2025-03-26. Created: 2025-03-26. Last updated: 2025-03-26. Bibliographically approved.

Open Access in DiVA

fulltext (2109 kB)
File information
File name: FULLTEXT01.pdf
File size: 2109 kB
Checksum (SHA-512):
2cd58de04d0c34324619f351dfd39e20178b787ada8f86a67ba7273054f6aab932cc08080ab9d82b3f0c3701abe8172e68939a52ec5e804fb057a4b6106b6372
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Aziz, Imran
By organisation
FREIA

