Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/89329
DC Field | Value | Language
dc.contributor | Department of Mechanical Engineering | en_US
dc.contributor | Interdisciplinary Division of Aeronautical and Aviation Engineering | en_US
dc.creator | Feng, Y | en_US
dc.creator | Tse, K | en_US
dc.creator | Chen, S | en_US
dc.creator | Wen, CY | en_US
dc.creator | Li, B | en_US
dc.date.accessioned | 2021-03-12T09:35:59Z | -
dc.date.available | 2021-03-12T09:35:59Z | -
dc.identifier.issn | 1424-8220 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/89329 | -
dc.language.iso | en | en_US
dc.publisher | Molecular Diversity Preservation International (MDPI) | en_US
dc.rights | © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Feng, Y.; Tse, K.; Chen, S.; Wen, C.-Y.; Li, B. Learning-Based Autonomous UAV System for Electrical and Mechanical (E&M) Device Inspection. Sensors 2021, 21 (4), 1385, 1-19 is available at https://doi.org/10.3390/s21041385 | en_US
dc.subject | Autonomous inspection | en_US
dc.subject | Deep learning | en_US
dc.subject | Object detection | en_US
dc.subject | UAV | en_US
dc.title | Learning-based autonomous UAV system for electrical and mechanical (E&M) device inspection | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1 | en_US
dc.identifier.epage | 19 | en_US
dc.identifier.volume | 21 | en_US
dc.identifier.issue | 4 | en_US
dc.identifier.doi | 10.3390/s21041385 | en_US
dcterms.abstract | The inspection of electrical and mechanical (E&M) devices using unmanned aerial vehicles (UAVs) has become an increasingly popular choice in the last decade due to their flexibility and mobility. UAVs have the potential to reduce human involvement in visual inspection tasks, which could increase efficiency and reduce risks. This paper presents a UAV system for autonomously performing E&M device inspection. The proposed system relies on learning-based detection for perception, multi-sensor fusion for localization, and path planning for fully autonomous inspection. The perception method utilizes semantic and spatial information generated by a 2-D object detector. The information is then fused with depth measurements for object state estimation. No prior knowledge about the location and category of the target device is needed. The system design is validated by flight experiments using a quadrotor platform. The results show that the proposed UAV system completes the inspection mission autonomously while ensuring a stable and collision-free flight. | en_US
dcterms.accessRights | open access | -
dcterms.bibliographicCitation | Sensors (Switzerland), 2 Feb. 2021, v. 21, no. 4, 1385, p. 1-19 | en_US
dcterms.isPartOf | Sensors (Switzerland) | en_US
dcterms.issued | 2021-02-02 | -
dc.identifier.scopus | 2-s2.0-85100970220 | -
dc.identifier.artn | 1385 | en_US
dc.description.validate | 202103 bcvc | en_US
dc.description.oa | Version of Record | -
dc.identifier.FolderNumber | a0618-n01, a0716-n02 | -
dc.identifier.SubFormID | 610, 1039 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | -
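As a rough illustration of the perception idea summarized in the abstract above (fusing the output of a 2-D object detector with a depth measurement to estimate the state of a target device), the following minimal Python sketch back-projects the centre of a detected bounding box into a 3-D point using the pinhole camera model. The function name, camera intrinsics, and detection values are hypothetical placeholders for illustration only and are not taken from the paper.

```python
# Hypothetical sketch (not the authors' code): back-project the centre of a
# 2-D detection into a 3-D point in the camera frame using a depth value and
# the pinhole camera model, as one plausible way to fuse detector output with
# depth for object state estimation.
import numpy as np

def bbox_center_to_3d(bbox, depth, fx, fy, cx, cy):
    """bbox = (xmin, ymin, xmax, ymax) in pixels; depth in metres along the optical axis."""
    u = 0.5 * (bbox[0] + bbox[2])   # pixel u-coordinate of the box centre
    v = 0.5 * (bbox[1] + bbox[3])   # pixel v-coordinate of the box centre
    x = (u - cx) * depth / fx       # pinhole back-projection
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])  # 3-D point in the camera frame

# Example with made-up intrinsics and a made-up detection of an E&M device
point_cam = bbox_center_to_3d((300, 180, 420, 320), depth=2.5,
                              fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(point_cam)
```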
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
sensors-21-01385-v2.pdf |  | 47.01 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 106 (last week: 0; as of Apr 14, 2024)
Downloads: 58 (as of Apr 14, 2024)
SCOPUS™ citations: 15 (as of Apr 19, 2024)
Web of Science™ citations: 12 (as of Apr 18, 2024)