Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/92770
DC Field | Value | Language
dc.contributor | Department of Aeronautical and Aviation Engineering | en_US
dc.creator | Bai, X | en_US
dc.creator | Zhang, B | en_US
dc.creator | Wen, W | en_US
dc.creator | Hsu, LT | en_US
dc.creator | Li, H | en_US
dc.date.accessioned | 2022-05-16T09:07:39Z | -
dc.date.available | 2022-05-16T09:07:39Z | -
dc.identifier.isbn | 978-1-7281-0244-3 (Electronic ISBN) | en_US
dc.identifier.isbn | 978-1-7281-9446-2 (Print on Demand (PoD) ISBN) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/92770 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication Bai, X., Zhang, B., Wen, W., Hsu, L. T., & Li, H. (2020, April). Perception-aided visual-inertial integrated positioning in dynamic urban areas. In 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS) (pp. 1563-1571). IEEE is available at https://doi.org/10.1109/PLANS46316.2020.9109963 | en_US
dc.subject | INS | en_US
dc.subject | Navigation | en_US
dc.subject | Positioning | en_US
dc.subject | Urban Areas | en_US
dc.subject | VINS | en_US
dc.subject | Visual Odometry | en_US
dc.title | Perception-aided visual-inertial integrated positioning in dynamic urban areas | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 1563 | en_US
dc.identifier.epage | 1571 | en_US
dc.identifier.doi | 10.1109/PLANS46316.2020.9109963 | en_US
dcterms.abstract | Visual-inertial navigation systems (VINS) have been extensively studied in the past decades to provide positioning services for autonomous systems, such as autonomous driving vehicles (ADV) and unmanned aerial vehicles (UAV). VINS can achieve decent performance in indoor scenarios with stable illumination and texture information. Unfortunately, applying VINS in dynamic urban areas remains challenging, because the numerous dynamic objects can significantly degrade its performance. A straightforward way to mitigate the impact of dynamic objects on VINS is to use a deep neural network (DNN) to detect and remove the image features that belong to unexpected objects, such as moving vehicles and pedestrians. However, excessive exclusion of features can significantly distort the geometric distribution of the visual features and, even worse, can make the system states unobservable. Instead of directly excluding the features that possibly belong to dynamic objects, this paper proposes to remodel the uncertainty of the dynamic features, so that both healthy and dynamic features are used in the VINS. An experiment in a typical urban canyon is conducted to validate the performance of the proposed method. The results show that the proposed method effectively mitigates the impact of dynamic objects and improves positioning accuracy. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS), 20-23 April 2020, Portland, OR, USA, p. 1563-1571 | en_US
dcterms.issued | 2020 | -
dc.identifier.scopus | 2-s2.0-85085578671 | -
dc.relation.conference | IEEE/ION Position, Location and Navigation Symposium [PLANS] | en_US
dc.description.validate | 202205 bckw | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | AAE-0086 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | The Hong Kong Polytechnic University | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 23858526 | -
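
The abstract above outlines the key idea of the paper: rather than discarding image features that a DNN flags as belonging to dynamic objects, their measurement uncertainty is remodeled so that the estimator down-weights them while still using all features. The following is a minimal, illustrative Python sketch of that idea, not the authors' implementation; the box-membership test, noise values, and inflation factor are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): instead of discarding
# features flagged as dynamic by an object detector, inflate their measurement
# noise covariance so the estimator down-weights them while the feature
# geometry stays intact. All names and numbers below are assumptions.

BASE_PIXEL_SIGMA = 1.0    # nominal reprojection noise (pixels), assumed
DYNAMIC_INFLATION = 10.0  # sigma inflation for dynamic features, assumed


def feature_in_dynamic_box(feature_uv, dynamic_boxes):
    """Return True if the pixel lies inside any detected dynamic-object box.

    dynamic_boxes: iterable of (u_min, v_min, u_max, v_max) from a DNN detector.
    """
    u, v = feature_uv
    return any(u_min <= u <= u_max and v_min <= v <= v_max
               for (u_min, v_min, u_max, v_max) in dynamic_boxes)


def measurement_covariances(features_uv, dynamic_boxes):
    """Build a 2x2 reprojection-noise covariance for each feature.

    Healthy features keep the nominal covariance; features inside dynamic-object
    boxes get an inflated covariance instead of being removed, so they still
    constrain the solution, just with less weight.
    """
    covariances = []
    for uv in features_uv:
        sigma = BASE_PIXEL_SIGMA
        if feature_in_dynamic_box(uv, dynamic_boxes):
            sigma *= DYNAMIC_INFLATION
        covariances.append(np.eye(2) * sigma ** 2)
    return covariances


if __name__ == "__main__":
    boxes = [(100, 50, 220, 180)]            # e.g. a detected moving vehicle
    feats = [(60.0, 40.0), (150.0, 120.0)]   # one static, one dynamic feature
    for uv, R in zip(feats, measurement_covariances(feats, boxes)):
        print(uv, "sigma^2 =", R[0, 0])
```
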
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Bai_Perception-Aided_Visual-Inertial_Integrated.pdf | Pre-Published version | 2.53 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 64 (as of Apr 21, 2024)
Downloads: 101 (as of Apr 21, 2024)
SCOPUS™ citations: 9 (as of Apr 26, 2024)
Web of Science™ citations: 7 (as of Apr 25, 2024)
