Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/91893
DC Field | Value | Language
dc.contributor | Department of Building and Real Estate | en_US
dc.creator | Wu, H | en_US
dc.creator | Shen, GQ | en_US
dc.creator | Lin, X | en_US
dc.creator | Li, M | en_US
dc.creator | Li, CZ | en_US
dc.date.accessioned | 2022-01-05T07:13:26Z | -
dc.date.available | 2022-01-05T07:13:26Z | -
dc.identifier.issn | 0926-5805 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/91893 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.rights | © 2021 Elsevier B.V. All rights reserved. | en_US
dc.rights | © 2021. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/. | en_US
dc.rights | The following publication Wu, H., Shen, G. Q., Lin, X., Li, M., & Li, C. Z. (2021). A transformer-based deep learning model for recognizing communication-oriented entities from patents of ICT in construction. Automation in Construction, 125, 103608 is available at https://dx.doi.org/10.1016/j.autcon.2021.103608. | en_US
dc.subject | Information and communications technology (ICT) | en_US
dc.subject | Construction industry | en_US
dc.subject | Entity recognition | en_US
dc.subject | Deep learning | en_US
dc.subject | Transformer | en_US
dc.subject | Contextual information | en_US
dc.title | A transformer-based deep learning model for recognizing communication-oriented entities from patents of ICT in construction | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 125 | en_US
dc.identifier.doi | 10.1016/j.autcon.2021.103608 | en_US
dcterms.abstract | The patents of information and communication technology (ICT) in construction are valuable sources of technological solutions to communication problems in construction practice. However, it is often difficult for practitioners and stakeholders to identify the key communication functionalities from complicated expressions in patent documents. Addressing these challenges, this study develops a deep learning model to enable automatic recognition of communication-oriented entities (CEs) from patent documents. The proposed model is structured based on the Transformer, consisting of feed-forward and self-attention neural networks, to better recognize ambiguous and unknown entities by utilizing contextual information. The validation results showed that the proposed model outperforms traditional recurrent neural network (RNN)-based models in CE recognition, especially in recognizing ambiguous and unknown entities. Moreover, experimental results on research literature and a real-life project report showed satisfactory performance of the model in CE recognition across different document types. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Automation in construction, May 2021, v. 125, 103608 | en_US
dcterms.isPartOf | Automation in construction | en_US
dcterms.issued | 2021-05 | -
dc.identifier.isi | WOS:000649680900001 | -
dc.identifier.eissn | 1872-7891 | en_US
dc.identifier.artn | 103608 | en_US
dc.description.validate | 202201 bcvc | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a1135-n04 | -
dc.identifier.SubFormID | 43987 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China (No. 71771067, No. 71801159 and No. 52078302) | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections: Journal/Magazine Article

Files in This Item:
File | Description | Size | Format
Wu_Transformer-based_Deep_Learning.pdf | Pre-Published version | 5.21 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 127 (last week: 2) as of Apr 14, 2025
Downloads: 155 as of Apr 14, 2025
SCOPUS™ citations: 22 as of Jun 21, 2024
Web of Science™ citations: 26 as of Dec 18, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.