Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/108440
Title: Fusion of aerial, MMS and backpack images and point clouds for optimized 3D mapping in urban areas
Authors: Li, Z.; Wu, B.; Li, Y.; Chen, Z.
Issue Date: Aug-2023
Source: ISPRS journal of photogrammetry and remote sensing, Aug. 2023, v. 202, p. 463-478
Abstract: Photorealistic 3D models are important data sources for digital twin cities and smart city applications. These models are usually generated from data collected separately by aerial or ground-based platforms (e.g., mobile mapping systems (MMSs) and backpack systems). Aerial and ground-based platforms capture data from overhead and ground surfaces, respectively, offering complementary information for better 3D mapping in urban areas. In particular, backpack mapping systems have gained popularity in recent years, as they offer more flexibility to reach areas (e.g., narrow alleys and pedestrian routes) that are inaccessible to vehicle-based MMSs. However, the integration of aerial and ground data for 3D mapping suffers from difficulties such as tie-point matching among images from different platforms, which differ greatly in perspective, coverage, and scale. Optimal fusion of the results from different platforms is also challenging. Therefore, this paper presents a novel method for the fusion of aerial, MMS, and backpack images and point clouds for optimized 3D mapping in urban areas. A geometry-aware model for feature matching is developed based on the SuperGlue algorithm to obtain sufficient tie-points between aerial and ground images, which facilitates the integrated bundle adjustment of images to reduce their geometric inconsistencies and the subsequent dense image matching to generate 3D point clouds from the different image sources. Subsequently, a graph-based method considering both geometric and texture traits is developed for the optimal fusion of point clouds from different sources to generate 3D mesh models of better quality. Experiments conducted on a challenging dataset in Hong Kong demonstrated that the geometry-aware model could obtain a sufficient number of accurately matched tie-points among the aerial, MMS, and backpack images, which enabled the integrated bundle adjustment of the three image datasets to generate properly aligned point clouds. Compared with the results obtained from state-of-the-art commercial software, the 3D mesh models generated from the proposed point cloud fusion method exhibited better quality in terms of completeness, consistency, and level of detail.
Keywords: 3D mapping; Aerial oblique imagery; Backpack; Mobile mapping system (MMS)
Publisher: Elsevier BV
Journal: ISPRS journal of photogrammetry and remote sensing
ISSN: 0924-2716
EISSN: 1872-8235
DOI: 10.1016/j.isprsjprs.2023.07.010
Rights: © 2023 The Author(s). Published by Elsevier B.V. on behalf of International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). The following publication Li, Z., Wu, B., Li, Y., & Chen, Z. (2023). Fusion of aerial, MMS and backpack images and point clouds for optimized 3D mapping in urban areas. ISPRS Journal of Photogrammetry and Remote Sensing, 202, 463-478 is available at https://doi.org/10.1016/j.isprsjprs.2023.07.010.
Appears in Collections: Journal/Magazine Article
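The abstract above describes a geometry-aware feature-matching model built on the SuperGlue algorithm to obtain tie-points between aerial and ground images. The sketch below is not that model; it is a minimal classical baseline (SIFT keypoints, Lowe's ratio test, and a RANSAC fundamental-matrix check, all with standard OpenCV calls) that only illustrates what cross-platform tie-point matching with geometric outlier rejection involves. The image paths, ratio threshold, and RANSAC tolerance are placeholders.

```python
# Minimal sketch, not the paper's geometry-aware SuperGlue model: a classical
# SIFT + ratio-test + RANSAC baseline for tie-point matching between an aerial
# image and a ground-level (MMS or backpack) image.
import cv2
import numpy as np

def match_tie_points(path_aerial, path_ground, ratio=0.8, ransac_px=3.0):
    """Return RANSAC-filtered corresponding pixel coordinates in the two images."""
    img1 = cv2.imread(path_aerial, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(path_ground, cv2.IMREAD_GRAYSCALE)

    # Hand-crafted keypoints/descriptors (the paper uses a learned matcher instead).
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Lowe ratio test on 2-nearest-neighbour descriptor matches.
    raw = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m[0] for m in raw if len(m) == 2 and m[0].distance < ratio * m[1].distance]
    if len(good) < 8:  # too few matches to estimate a fundamental matrix
        return np.empty((0, 2)), np.empty((0, 2))

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Epipolar check: keep only matches consistent with a RANSAC-estimated
    # fundamental matrix -- a simple stand-in for "geometry awareness".
    _, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, ransac_px, 0.99)
    if mask is None:
        return pts1, pts2
    inliers = mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers]
```

The filtered correspondences returned by a call such as match_tie_points("aerial.jpg", "backpack.jpg") are the kind of cross-platform tie-points that would feed an integrated bundle adjustment of the aerial and ground image blocks.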
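The abstract also mentions a graph-based fusion of point clouds from different sources that weighs both geometric and texture traits. The sketch below does not reproduce that optimization; it is a simple per-voxel source-selection heuristic under assumed inputs: per-point texture-quality scores in [0, 1] supplied by the caller (e.g., derived from image ground sampling distance), with local point density used as a crude geometric term. Function names, the voxel size, and the weight alpha are all hypothetical.

```python
# Illustrative per-voxel point cloud fusion heuristic (not the paper's
# graph-based method): in each occupied voxel, keep the points of the source
# whose combined geometric + texture score is higher.
import numpy as np
from scipy.spatial import cKDTree

def local_density(points, radius=0.2):
    """Normalised neighbour count within `radius` metres (crude geometric score)."""
    tree = cKDTree(points)
    counts = np.array([len(n) for n in tree.query_ball_point(points, r=radius)])
    return counts / counts.max()

def _voxel_mean(points, scores, voxel):
    """Average score of the points falling in each occupied voxel."""
    keys = np.floor(points / voxel).astype(int)
    agg = {}
    for k, s in zip(map(tuple, keys), scores):
        tot, n = agg.get(k, (0.0, 0))
        agg[k] = (tot + s, n + 1)
    return {k: tot / n for k, (tot, n) in agg.items()}, keys

def fuse_by_voxel(cloud_a, cloud_b, tex_a, tex_b, voxel=0.5, alpha=0.6):
    """Keep, per voxel, the source with the higher score alpha*geometry +
    (1 - alpha)*texture; voxels seen by only one source keep that source."""
    score_a = alpha * local_density(cloud_a) + (1 - alpha) * np.asarray(tex_a)
    score_b = alpha * local_density(cloud_b) + (1 - alpha) * np.asarray(tex_b)
    va, keys_a = _voxel_mean(cloud_a, score_a, voxel)
    vb, keys_b = _voxel_mean(cloud_b, score_b, voxel)

    keep_a = np.array([va[k] >= vb.get(k, -np.inf) for k in map(tuple, keys_a)])
    keep_b = np.array([vb[k] > va.get(k, -np.inf) for k in map(tuple, keys_b)])
    return np.vstack([cloud_a[keep_a], cloud_b[keep_b]])
```

With cloud_a and cloud_b as Nx3 arrays (e.g., aerial and backpack point clouds), fused = fuse_by_voxel(cloud_a, cloud_b, tex_a=0.9, tex_b=0.5) prefers the denser or better-textured source wherever the two overlap, which is the basic trade-off the paper's graph-based formulation optimizes more rigorously.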
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 1-s2.0-S0924271623001910-main.pdf | | 42.18 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.