Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/91958
DC Field | Value | Language
dc.contributor | Department of Building and Real Estate | -
dc.creator | Su, MC | -
dc.creator | Chen, JH | -
dc.creator | Azzizi, VT | -
dc.creator | Chang, HL | -
dc.creator | Wei, HH | -
dc.date.accessioned | 2022-02-07T07:04:34Z | -
dc.date.available | 2022-02-07T07:04:34Z | -
dc.identifier.issn | 0957-4174 | -
dc.identifier.uri | http://hdl.handle.net/10397/91958 | -
dc.language.iso | en | en_US
dc.publisher | Pergamon Press | en_US
dc.rights | © 2021 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Su, M. C., Chen, J. H., Azzizi, V. T., Chang, H. L., & Wei, H. H. (2021). Smart training: Mask R-CNN oriented approach. Expert Systems with Applications, 185, 115595 is available at https://doi.org/10.1016/j.eswa.2021.115595 | en_US
dc.subject | Augmented reality | en_US
dc.subject | Finger-pointing analysis | en_US
dc.subject | Hand gesture recognition | en_US
dc.subject | Mask Regions with Convolutional Neural Network (R-CNN) | en_US
dc.subject | Smart training | en_US
dc.title | Smart training: Mask R-CNN oriented approach | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 185 | -
dc.identifier.doi | 10.1016/j.eswa.2021.115595 | -
dcterms.abstract | This paper presents an augmented reality assisted system, deployed on smart glasses, for training activities. A literature review comparing related technologies shows that a Mask Regions with Convolutional Neural Network (R-CNN) oriented approach best fits the study's needs. The proposed method comprises (1) pointing-gesture capture, (2) finger-pointing analysis, and (3) virtual-tool positioning and rotation-angle estimation. Results show that object-detection recognition accuracy is 95.5%, the Kappa value for gesture-detection recognition is 0.93, and the average time to detect a pointing gesture is 0.26 seconds. Furthermore, even under different lighting conditions, such as indoor and outdoor, pointing-analysis accuracy reaches 79%, and the error between the analysed angle and the actual angle is only 1.32 degrees. These results demonstrate that the system presents augmented reality effectively, making it applicable for real-world use. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Expert systems with applications, 15 Dec. 2021, v. 185, 115595 | -
dcterms.isPartOf | Expert systems with applications | -
dcterms.issued | 2021-12 | -
dc.identifier.scopus | 2-s2.0-85111293710 | -
dc.identifier.eissn | 1873-6793 | -
dc.identifier.artn | 115595 | -
dc.description.validate | 202202 bcvc | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This paper was partly supported by the Ministry of Science and Technology (MOST), Taiwan, for promoting academic excellence of universities under grant numbers MOST 109-2221-E-008-059-MY3 and MOST 110-2634-F-008-005. | en_US
dc.description.pubStatus | Published | en_US
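The gesture-detection agreement reported in the abstract (Kappa = 0.93) is presumably Cohen's kappa, which measures classifier-vs-ground-truth agreement corrected for chance. A minimal, self-contained sketch of that statistic is below; the two-class labels and counts are illustrative only, not data from the paper.

```python
from collections import Counter

def cohens_kappa(labels_true, labels_pred):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(labels_true) == len(labels_pred) and labels_true
    n = len(labels_true)
    # Observed agreement: fraction of positions where the labels match.
    observed = sum(t == p for t, p in zip(labels_true, labels_pred)) / n
    # Chance agreement: expected match rate from the two marginal distributions.
    freq_true = Counter(labels_true)
    freq_pred = Counter(labels_pred)
    expected = sum(freq_true[c] * freq_pred.get(c, 0) for c in freq_true) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative toy data: 100 frames, two gesture classes ("point", "none").
truth = ["point"] * 45 + ["none"] * 55
pred = ["point"] * 43 + ["none"] * 2 + ["none"] * 53 + ["point"] * 2
print(round(cohens_kappa(truth, pred), 2))  # → 0.92
```

With 96% raw agreement on these imbalanced toy labels, kappa comes out slightly lower (about 0.92) because some agreement is expected by chance, which is why the paper reports kappa rather than raw accuracy for gesture detection.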
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
1-s2.0-S0957417421009957-main.pdf | | 6.44 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 47 (last week: 0), as of Mar 24, 2024
Downloads: 33, as of Mar 24, 2024

SCOPUS™ citations: 7, as of Mar 28, 2024
Web of Science™ citations: 3, as of Mar 28, 2024


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.