Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115841
DC Field | Value | Language
dc.contributor | Department of Aeronautical and Aviation Engineering | -
dc.creator | Zheng, X | -
dc.creator | Wen, W | -
dc.creator | Hsu, LT | -
dc.date.accessioned | 2025-11-05T03:31:24Z | -
dc.date.available | 2025-11-05T03:31:24Z | -
dc.identifier.issn | 2379-8858 | -
dc.identifier.uri | http://hdl.handle.net/10397/115841 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication X. Zheng, W. Wen and L.-T. Hsu, 'Tightly-Coupled Visual/Inertial/Map Integration With Observability Analysis for Reliable Localization of Intelligent Vehicles,' in IEEE Transactions on Intelligent Vehicles, vol. 10, no. 2, pp. 863-875, Feb. 2025 is available at https://doi.org/10.1109/TIV.2024.3419101. | en_US
dc.subject | 3D prior map | en_US
dc.subject | Intelligent vehicles | en_US
dc.subject | Line feature | en_US
dc.subject | Localization | en_US
dc.subject | Observability analysis | en_US
dc.subject | Visual inertial odometry (VIO) | en_US
dc.title | Tightly-coupled visual/inertial/map integration with observability analysis for reliable localization of intelligent vehicles | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 863 | -
dc.identifier.epage | 875 | -
dc.identifier.volume | 10 | -
dc.identifier.issue | 2 | -
dc.identifier.doi | 10.1109/TIV.2024.3419101 | -
dcterms.abstract | Reliable and cost-effective localization is of great importance for the realization of intelligent vehicles (IV) in complex scenes. Visual-inertial odometry (VIO) can provide high-frequency position estimation but is subject to drift over time. Recently developed map-aided VIO opens a new window for drift-free visual localization, but the existing loosely coupled integration fails to fully exploit the complementarity of all the raw measurements. Moreover, the observability of existing map-aided VIO remains to be investigated, which is of great importance for safety-critical IV. To fill these gaps, in this article we propose a factor graph-based state estimator that tightly couples a lightweight 3D prior line map with a VIO system, together with a rigorous observability analysis. In particular, for cross-modality matching between 3D prior maps and 2D images, we first utilize the geometric line structure coexisting in the 3D map and the 2D image to build the line feature association model. More importantly, an efficient line-tracking strategy is designed to reject potential line feature-matching outliers. Meanwhile, a new line feature-based cost model is proposed as a constraint for factor graph optimization, with a proof of the rationality behind this model. Moreover, we analyze the observability of the proposed prior line feature-aided VIO system for the first time; the result shows that the three global translations (x, y, and z) are observable and that the system theoretically has only one unobservable direction, i.e., the yaw angle around the gravity vector. The proposed system is evaluated in both simulated outdoor and real-world indoor environments, and the results demonstrate the effectiveness of our methods. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on intelligent vehicles, Feb. 2025, v. 10, no. 2, p. 863-875 | -
dcterms.isPartOf | IEEE transactions on intelligent vehicles | -
dcterms.issued | 2025-02 | -
dc.identifier.scopus | 2-s2.0-105012246250 | -
dc.identifier.eissn | 2379-8904 | -
dc.description.validate | 202511 bcjz | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.SubFormID | G000326/2025-08 | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This work was supported in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2021A1515110771, in part by the Research Centre for Data Sciences and Artificial Intelligence, The Hong Kong Polytechnic University, through the Project “Data-driven-assisted GNSS RTK/INS Navigation for Autonomous Systems in Urban Canyons,” and in part by the Research Center of Deep Space Exploration, The Hong Kong Polytechnic University, through the Project Multi-robot Collaborative Operations in Lunar Areas for Regolith Processing. | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
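Note: the abstract above describes a line feature-based cost model that constrains factor graph optimization by associating 3D prior-map lines with detected 2D image lines. As a rough, hedged illustration of that idea only (this is not the authors' implementation; the pinhole model, frame conventions, and all names below are assumptions), a point-to-line reprojection residual of the kind commonly used for such map-line factors could look like:

# Minimal illustrative sketch of a map-line reprojection residual (assumed names).
import numpy as np

def project(K, R_cw, t_cw, X_w):
    """Project a 3D world point into pixel coordinates under pose (R_cw, t_cw)."""
    X_c = R_cw @ X_w + t_cw          # world -> camera frame
    uv = K @ (X_c / X_c[2])          # pinhole projection
    return uv[:2]

def line_residual(K, R_cw, t_cw, P_w, Q_w, p_img, q_img):
    """Signed pixel distances of the detected 2D line endpoints (p_img, q_img)
    to the infinite line through the projected 3D prior-map endpoints (P_w, Q_w)."""
    p_proj = project(K, R_cw, t_cw, P_w)
    q_proj = project(K, R_cw, t_cw, Q_w)
    # Homogeneous line through the two projected endpoints: l = p x q
    l = np.cross(np.append(p_proj, 1.0), np.append(q_proj, 1.0))
    n = np.linalg.norm(l[:2]) + 1e-12      # normalise so the residual is in pixels
    d1 = np.dot(l, np.append(p_img, 1.0)) / n
    d2 = np.dot(l, np.append(q_img, 1.0)) / n
    return np.array([d1, d2])              # 2-vector residual for the optimizer

In the tightly coupled formulation described in the abstract, a residual of this form would enter the factor graph alongside the VIO's inertial and visual factors; the sketch covers only the geometric part of a single map-line constraint.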
Appears in Collections: Journal/Magazine Article
Open Access Information
Status: open access
File Version: Final Accepted Manuscript