Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/99910
View/Download Full Text
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | -
dc.creator | Wan, Q | -
dc.creator | Yu, Y | -
dc.creator | Chen, R | -
dc.creator | Chen, L | -
dc.date.accessioned | 2023-07-26T05:48:56Z | -
dc.date.available | 2023-07-26T05:48:56Z | -
dc.identifier.uri | http://hdl.handle.net/10397/99910 | -
dc.language.iso | en | en_US
dc.publisher | MDPI | en_US
dc.rights | © 2022 by the authors. Licensee MDPI, Basel, Switzerland. | en_US
dc.rights | This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Wan Q, Yu Y, Chen R, Chen L. Map-Assisted 3D Indoor Localization Using Crowd-Sensing-Based Trajectory Data and Error Ellipse-Enhanced Fusion. Remote Sensing. 2022; 14(18):4636 is available at https://doi.org/10.3390/rs14184636. | en_US
dc.subject | Crowd-sensing | en_US
dc.subject | Walking speed estimator | en_US
dc.subject | Fingerprinting database | en_US
dc.subject | Floor identification | en_US
dc.subject | Error ellipse | en_US
dc.title | Map-assisted 3D indoor localization using crowd-sensing-based trajectory data and error ellipse-enhanced fusion | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 14 | en_US
dc.identifier.issue | 18 | en_US
dc.identifier.doi | 10.3390/rs14184636 | en_US
dcterms.abstract | Crowd-sensing-based localization is regarded as an effective method for providing indoor location-based services in large-scale urban areas. The performance of the crowd-sensing approach depends on the accuracy of collected daily-life trajectories and on the efficient combination of different location sources with indoor maps. This paper proposes a robust map-assisted 3D indoor localization framework using crowd-sensing-based trajectory data and error-ellipse-enhanced fusion (ML-CTEF). In the offline phase, a novel inertial odometry that combines a 1D convolutional neural network (1D-CNN) with a bidirectional long short-term memory (Bi-LSTM)-based walking speed estimator is proposed for accurate pre-processing of crowd-sensed trajectory data under different handheld modes. The Bi-LSTM network is further applied to floor identification, and an indoor-network matching algorithm is adopted for effortless generation of the fingerprinting database. In the online phase, an error-ellipse-assisted particle filter is proposed for the intelligent integration of inertial odometry, crowdsourced Wi-Fi fingerprinting, and indoor map information. Experimental results show that the proposed ML-CTEF achieves autonomous and precise 3D indoor localization in complex, large-scale indoor environments; the estimated average positioning error is within 1.01 m in a multi-floor indoor building. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Remote sensing, Sept 2022, v. 14, no. 18, 4636 | en_US
dcterms.isPartOf | Remote sensing | en_US
dcterms.issued | 2022-09 | -
dc.identifier.scopus | 2-s2.0-85138771842 | -
dc.identifier.eissn | 2072-4292 | en_US
dc.identifier.artn | 4636 | en_US
dc.description.validate | 202307 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | State Bureau of Surveying and Mapping; Hong Kong Polytechnic University | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
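The abstract's online phase fuses inertial odometry with crowdsourced Wi-Fi fixes via an error-ellipse-assisted particle filter. The paper's implementation is not reproduced on this page; the following is only a minimal illustrative sketch under my own assumptions (all function names, noise parameters, and the toy scenario are invented, and the indoor-map constraint and fingerprint matching are omitted). Particles are propagated by odometry increments and reweighted by a 2D Gaussian whose covariance stands in for the error ellipse around each Wi-Fi fix.

```python
import numpy as np

rng = np.random.default_rng(42)

def ellipse_weight(particles, fix, cov):
    """Weight particles by a 2D Gaussian error ellipse centred on a Wi-Fi fix."""
    inv = np.linalg.inv(cov)
    d = particles - fix
    m2 = np.einsum("ni,ij,nj->n", d, inv, d)  # squared Mahalanobis distance
    return np.exp(-0.5 * m2)

def pf_step(particles, weights, odom_delta, wifi_fix, wifi_cov, motion_noise=0.05):
    # 1) propagate each particle by the inertial-odometry increment plus noise
    particles = particles + odom_delta + rng.normal(0.0, motion_noise, particles.shape)
    # 2) reweight by the error ellipse around the crowdsourced Wi-Fi fix
    weights = weights * ellipse_weight(particles, wifi_fix, wifi_cov)
    weights = weights / weights.sum()
    # 3) resample when the effective sample size collapses below half the swarm
    if 1.0 / np.sum(weights**2) < 0.5 * len(weights):
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles, weights = particles[idx], np.full(len(weights), 1.0 / len(weights))
    return particles, weights

# Toy run: the true position advances 0.5 m per step along x.
N = 500
particles = rng.normal([0.0, 0.0], 1.0, (N, 2))
weights = np.full(N, 1.0 / N)
cov = np.array([[1.0, 0.3], [0.3, 0.5]])  # elongated error ellipse for Wi-Fi fixes
for t in range(1, 21):
    truth = np.array([0.5 * t, 0.0])
    fix = truth + rng.multivariate_normal([0.0, 0.0], cov)
    particles, weights = pf_step(particles, weights, np.array([0.5, 0.0]), fix, cov)
estimate = weights @ particles  # weighted-mean position after 20 steps
```

In this sketch the ellipse covariance does the work described in the abstract: fixes along the ellipse's long axis are penalized less than equally distant fixes across its short axis, so the fusion trusts each source anisotropically rather than by a single scalar radius.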
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Wan_Map-Assisted_3D_Indoor.pdf | - | 5.25 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 66 (as of Apr 14, 2025)
Downloads: 32 (as of Apr 14, 2025)
SCOPUS™ Citations: 6 (as of Dec 19, 2025)
Web of Science™ Citations: 4 (as of Jan 9, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.