Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/97701
DC Field | Value | Language
dc.contributor | Department of Electrical Engineering | en_US
dc.creator | Gao, X | en_US
dc.creator | Ma, H | en_US
dc.creator | Chan, KW | en_US
dc.creator | Xia, S | en_US
dc.creator | Zhu, Z | en_US
dc.date.accessioned | 2023-03-09T07:42:49Z | -
dc.date.available | 2023-03-09T07:42:49Z | -
dc.identifier.uri | http://hdl.handle.net/10397/97701 | -
dc.language.iso | en | en_US
dc.publisher | Frontiers Research Foundation | en_US
dc.rights | Copyright © 2021 Gao, Ma, Chan, Xia and Zhu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) (https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. | en_US
dc.rights | The following publication Gao X, Ma H, Chan KW, Xia S and Zhu Z (2021) A Learning-Based Bidding Approach for PV-Attached BESS Power Plants. Front. Energy Res. 9:750796 is available at https://doi.org/10.3389/fenrg.2021.750796. | en_US
dc.subject | BESS | en_US
dc.subject | Bidding strategy | en_US
dc.subject | Incomplete information game | en_US
dc.subject | Multiagent reinforcement learning | en_US
dc.subject | PV | en_US
dc.subject | WoLF-PHC | en_US
dc.title | A learning-based bidding approach for PV-attached BESS power plants | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 9 | en_US
dc.identifier.doi | 10.3389/fenrg.2021.750796 | en_US
dcterms.abstract | Large-scale renewable photovoltaic (PV) and battery energy storage system (BESS) units are expected to become significant electricity suppliers in the future electricity market. A bidding model is proposed for PV-integrated BESS power plants in a pool-based day-ahead (DA) electricity market, in which the uncertainty of PV generation output is considered. In the proposed model, the market clearing process is regarded as the external environment, while each agent updates its bid price through interaction with the market environment to maximize its revenue. A multiagent reinforcement learning (MARL) algorithm called win-or-learn-fast policy hill-climbing (WoLF-PHC) is used to explore optimal bid prices without any information about opponents. The case study validates the computational performance of WoLF-PHC in the proposed model, and the bidding strategy of each participating agent is then analyzed. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Frontiers in Energy Research, Oct. 2021, v. 9, 750796 | en_US
dcterms.isPartOf | Frontiers in energy research | en_US
dcterms.issued | 2021-10 | -
dc.identifier.isi | WOS:000712779800001 | -
dc.identifier.scopus | 2-s2.0-85117696028 | -
dc.identifier.eissn | 2296-598X | en_US
dc.identifier.artn | 750796 | en_US
dc.description.validate | 202303 bcww | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | BK20180284; Shenzhen Polytechnic, SZPT; National Natural Science Foundation of China, NSFC: 52077075; Natural Science Foundation of Guangdong Province: 2020A1515010461; Hong Kong Polytechnic University, PolyU | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
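
The abstract above names win-or-learn-fast policy hill-climbing (WoLF-PHC) as the learning rule each bidding agent applies. For orientation only, the sketch below is a minimal, generic tabular WoLF-PHC update in Python; it is not the authors' implementation, and the class name, hyperparameter values, and the assumption of discretised market states and bid-price actions are illustrative.

```python
import numpy as np

class WoLFPHCAgent:
    """Illustrative tabular WoLF-PHC learner (hypothetical, not from the paper)."""

    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.95,
                 delta_win=0.01, delta_lose=0.04):
        self.n_actions = n_actions
        self.alpha, self.gamma = alpha, gamma
        self.delta_win, self.delta_lose = delta_win, delta_lose
        self.Q = np.zeros((n_states, n_actions))
        self.pi = np.full((n_states, n_actions), 1.0 / n_actions)      # current mixed policy
        self.pi_avg = np.full((n_states, n_actions), 1.0 / n_actions)  # running average policy
        self.visits = np.zeros(n_states)                               # state visit counts

    def act(self, s, rng=None):
        # Sample a (discretised) bid-price index from the current mixed policy.
        if rng is None:
            rng = np.random.default_rng()
        return rng.choice(self.n_actions, p=self.pi[s])

    def update(self, s, a, r, s_next):
        # 1) Standard Q-learning update on the observed reward.
        self.Q[s, a] += self.alpha * (r + self.gamma * self.Q[s_next].max() - self.Q[s, a])

        # 2) Update the running average policy for state s.
        self.visits[s] += 1
        self.pi_avg[s] += (self.pi[s] - self.pi_avg[s]) / self.visits[s]

        # 3) Win-or-learn-fast: take a small step when "winning" (the current
        #    policy beats the average policy in expected value), a large step otherwise.
        winning = self.pi[s] @ self.Q[s] >= self.pi_avg[s] @ self.Q[s]
        delta = self.delta_win if winning else self.delta_lose

        # 4) Policy hill-climbing: move probability mass toward the greedy action.
        greedy = self.Q[s].argmax()
        for a_i in range(self.n_actions):
            if a_i == greedy:
                continue
            step = min(self.pi[s, a_i], delta / (self.n_actions - 1))
            self.pi[s, a_i] -= step
            self.pi[s, greedy] += step
```

Choosing delta_lose larger than delta_win is what makes the agent "learn fast" when its current policy underperforms its historical average, which is the WoLF principle the abstract refers to.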
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Gao_learning-based_bidding_approach.pdf | - | 2.13 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 80 (as of Apr 14, 2025)
Downloads: 37 (as of Apr 14, 2025)
Scopus™ citations: 3 (as of Sep 12, 2025)
Web of Science™ citations: 2 (as of Dec 18, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.