Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/115841
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Aeronautical and Aviation Engineering | - |
| dc.creator | Zheng, X | - |
| dc.creator | Wen, W | - |
| dc.creator | Hsu, LT | - |
| dc.date.accessioned | 2025-11-05T03:31:24Z | - |
| dc.date.available | 2025-11-05T03:31:24Z | - |
| dc.identifier.issn | 2379-8858 | - |
| dc.identifier.uri | http://hdl.handle.net/10397/115841 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Institute of Electrical and Electronics Engineers | en_US |
| dc.rights | © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US |
| dc.rights | The following publication X. Zheng, W. Wen and L. -T. Hsu, 'Tightly-Coupled Visual/Inertial/Map Integration With Observability Analysis for Reliable Localization of Intelligent Vehicles,' in IEEE Transactions on Intelligent Vehicles, vol. 10, no. 2, pp. 863-875, Feb. 2025 is available at https://doi.org/10.1109/TIV.2024.3419101. | en_US |
| dc.subject | 3D prior map | en_US |
| dc.subject | Intelligent vehicles | en_US |
| dc.subject | Line feature | en_US |
| dc.subject | Localization | en_US |
| dc.subject | Observability analysis | en_US |
| dc.subject | Visual inertial odometry (VIO) | en_US |
| dc.title | Tightly-coupled visual/inertial/map integration with observability analysis for reliable localization of intelligent vehicles | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.spage | 863 | - |
| dc.identifier.epage | 875 | - |
| dc.identifier.volume | 10 | - |
| dc.identifier.issue | 2 | - |
| dc.identifier.doi | 10.1109/TIV.2024.3419101 | - |
| dcterms.abstract | Reliable and cost-effective localization is of great importance for the realization of intelligent vehicles (IVs) in complex scenes. Visual-inertial odometry (VIO) can provide high-frequency position estimates but is subject to drift over time. Recently developed map-aided VIO opens a new window for drift-free visual localization, but the existing loosely coupled integration fails to fully exploit the complementarity of all the raw measurements. Moreover, the observability of the existing map-aided VIO remains to be investigated, which is of great importance for safety-critical IVs. To fill these gaps, in this article we propose a factor graph-based state estimator that tightly couples a lightweight 3D prior line map with a VIO system, together with a rigorous observability analysis. In particular, for cross-modality matching between 3D prior maps and 2D images, we first exploit the geometric line structures coexisting in the 3D map and the 2D image to build a line feature association model. More importantly, an efficient line-tracking strategy is designed to reject potential line feature-matching outliers. Meanwhile, a new line feature-based cost model is proposed as a constraint for factor graph optimization, with proof of the rationale behind this model. Moreover, we analyze the observability of the proposed prior line feature-aided VIO system for the first time; the results show that the three global translations (x, y, and z) are observable and that the system theoretically has only one unobservable direction, namely the yaw rotation around the gravity vector. The proposed system is evaluated in both simulated outdoor and real-world indoor environments, and the results demonstrate the effectiveness of our methods. | - |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | IEEE transactions on intelligent vehicles, Feb. 2025, v. 10, no. 2, p. 863-875 | - |
| dcterms.isPartOf | IEEE transactions on intelligent vehicles | - |
| dcterms.issued | 2025-02 | - |
| dc.identifier.scopus | 2-s2.0-105012246250 | - |
| dc.identifier.eissn | 2379-8904 | - |
| dc.description.validate | 202511 bcjz | - |
| dc.description.oa | Accepted Manuscript | en_US |
| dc.identifier.SubFormID | G000326/2025-08 | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | This work was supported in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2021A1515110771, in part by the Research Centre for Data Sciences and Artificial Intelligence, The Hong Kong Polytechnic University, through the Project “Data-driven-assisted GNSS RTK/INS Navigation for Autonomous Systems in Urban Canyons,” and in part by the Research Center of Deep Space Exploration, The Hong Kong Polytechnic University, through the Project Multi-robot Collaborative Operations in Lunar Areas for Regolith Processing. | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.oaCategory | Green (AAM) | en_US |
| Appears in Collections: | Journal/Magazine Article | |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.