Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/117497
DC Field: Value
dc.contributor: Department of Land Surveying and Geo-Informatics
dc.contributor: Research Centre for Artificial Intelligence in Geomatics
dc.contributor: Research Institute for Land and Space
dc.creator: Hou, Q
dc.creator: Hou, C
dc.creator: Zhang, F
dc.creator: Weng, Q
dc.date.accessioned: 2026-02-26T03:46:18Z
dc.date.available: 2026-02-26T03:46:18Z
dc.identifier.issn: 0924-2716
dc.identifier.uri: http://hdl.handle.net/10397/117497
dc.language.iso: en
dc.publisher: Elsevier BV
dc.rights: © 2025 The Author(s). Published by Elsevier B.V. on behalf of International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
dc.rights: The following publication Hou, Q., Hou, C., Zhang, F., & Weng, Q. (2025). Multi-source geo-localization in urban built environments for crowd-sourced images by contrastive learning. ISPRS Journal of Photogrammetry and Remote Sensing, 230, 616-629 is available at https://doi.org/10.1016/j.isprsjprs.2025.09.024.
dc.subject: Crowd-sourced images
dc.subject: High-resolution satellite images
dc.subject: Image geo-localization
dc.subject: Multi-source data fusion
dc.subject: Street-view images
dc.subject: Urban spatial analytics
dc.title: Multi-source geo-localization in urban built environments for crowd-sourced images by contrastive learning
dc.type: Journal/Magazine Article
dc.identifier.spage: 616
dc.identifier.epage: 629
dc.identifier.volume: 230
dc.identifier.doi: 10.1016/j.isprsjprs.2025.09.024
dcterms.abstract: Crowd-sourced images (CSIs) offer an unprecedented opportunity for gaining deeper insights into urban built environments. However, the lack of precise geographic information limits their effectiveness in various urban applications. Traditional geo-localization methods, which rely on matching CSIs with geo-tagged street-view images (SVIs), face significant challenges due to sparse coverage and temporal misalignment of reference data, especially in developing countries. To overcome these limitations, this paper proposes a novel contrastive learning framework that integrates SVIs and satellite images (SIs), utilizing a multi-scale channel attention module and InfoNCE loss to enhance the geo-localization accuracy of CSIs. Additionally, we leverage SIs to generate synthetic SVIs in areas where actual SVIs are unavailable or outdated, ensuring comprehensive coverage across diverse urban environments. A simple yet efficient data preprocessing method is proposed to align multi-view images for enhanced feature fusion. As part of our research efforts, we introduce a Multi-Source Geo-localization Dataset (MSGD) consisting of 500k geo-tagged pairs collected from 12 cities across six continents, encompassing diverse urban typologies from dense skyscraper districts to low-density areas, providing a valuable resource for future research and advancements in geo-localization methods. Our experiments show that the proposed method outperforms state-of-the-art approaches on the challenging MSGD dataset, highlighting the importance of incorporating SIs as a complementary data source for accurate geo-localization. Our code and dataset will be released at https://github.com/RCAIG/CrowdsourcingGeoLocalization.
dcterms.accessRights: open access
dcterms.bibliographicCitation: ISPRS journal of photogrammetry and remote sensing, Dec. 2025, v. 230, p. 616-629
dcterms.isPartOf: ISPRS journal of photogrammetry and remote sensing
dcterms.issued: 2025-12
dc.identifier.scopus: 2-s2.0-105018176301
dc.identifier.eissn: 1872-8235
dc.description.validate: 202602 bcch
dc.description.oa: Version of Record
dc.identifier.FolderNumber: OA_Scopus/WOS
dc.description.fundingSource: Others
dc.description.fundingText: This research has received funding from Global STEM Professorship, Hong Kong SAR Government (P0039329) and Hong Kong Polytechnic University (P0046482 and P0038446).
dc.description.pubStatus: Published
dc.description.oaCategory: CC
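The abstract above mentions training with an InfoNCE contrastive loss to match crowd-sourced images against street-view and satellite references. A minimal, generic sketch of that loss is given below; it is not the paper's implementation — the cosine-similarity scoring, the temperature value, and all variable names are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def info_nce_loss(query, candidates, pos_index=0, temperature=0.07):
    """InfoNCE: softmax cross-entropy over similarity logits, treating
    candidates[pos_index] as the matching (positive) reference and all
    other candidates as negatives."""
    logits = [cosine(query, c) / temperature for c in candidates]
    m = max(logits)  # subtract max before exp for numerical stability
    exps = [math.exp(l - m) for l in logits]
    return -math.log(exps[pos_index] / sum(exps))

# Toy example: a query embedding close to candidate 0, far from the rest.
query = [1.0, 0.0]
candidates = [[0.9, 0.1], [-1.0, 0.0], [0.0, 1.0]]
loss_match = info_nce_loss(query, candidates, pos_index=0)
loss_mismatch = info_nce_loss(query, candidates, pos_index=1)
```

Minimizing this loss pulls the query embedding toward its geo-tagged positive and pushes it away from the negatives, which is the core mechanism behind retrieval-style geo-localization; in practice such a loss is computed batch-wise on GPU rather than pair-by-pair as sketched here.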
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: 1-s2.0-S092427162500382X-main.pdf (7.13 MB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.