Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/87526
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | -
dc.creator | Li, M | en_US
dc.creator | Chen, R | en_US
dc.creator | Liao, X | en_US
dc.creator | Guo, B | en_US
dc.creator | Zhang, W | en_US
dc.creator | Guo, G | en_US
dc.date.accessioned | 2020-07-16T03:57:54Z | -
dc.date.available | 2020-07-16T03:57:54Z | -
dc.identifier.uri | http://hdl.handle.net/10397/87526 | -
dc.language.iso | en | en_US
dc.publisher | Molecular Diversity Preservation International (MDPI) | en_US
dc.rights | © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Li M, Chen R, Liao X, Guo B, Zhang W, Guo G. A Precise Indoor Visual Positioning Approach Using a Built Image Feature Database and Single User Image from Smartphone Cameras. Remote Sensing. 2020; 12(5):869, is available at https://doi.org/10.3390/rs12050869 | en_US
dc.subject | Camera pose | en_US
dc.subject | Feature matching | en_US
dc.subject | Indoor visual positioning | en_US
dc.subject | Smartphone | en_US
dc.subject | SURF | en_US
dc.title | A precise indoor visual positioning approach using a built image feature database and single user image from smartphone cameras | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 12 | en_US
dc.identifier.issue | 5 | en_US
dc.identifier.doi | 10.3390/rs12050869 | en_US
dcterms.abstract | Indoor visual positioning is a key technology in a variety of indoor location services and applications. The particular spatial structures and environments of indoor spaces make them a challenging scene for visual positioning. To address the existing problems of low positioning accuracy and low robustness, this paper proposes a precise single-image-based indoor visual positioning method for smartphones. The proposed method comprises three procedures. First, color sequence images of the indoor environment are collected in an experimental room, from which an indoor precise-positioning feature database is produced using a classic speeded-up robust features (SURF) point-matching strategy and multi-image spatial forward intersection. Second, the relationships between the SURF feature points of the smartphone positioning image and object-space 3D points are obtained by an efficient similarity-based feature description retrieval method, in which a more reliable and correct set of matching point pairs is obtained using a novel matching-error elimination technique based on Hough transform voting. Finally, the efficient perspective-n-point (EPnP) and bundle adjustment (BA) methods are used to calculate the intrinsic and extrinsic parameters of the positioning image, from which the location of the smartphone is obtained. Compared with the ground truth, the experimental results indicate that the proposed approach can be used for indoor positioning with an accuracy of approximately 10 cm. In addition, the experiments show that the proposed method is more robust and efficient than the baseline method in a real scene. Where sufficient indoor texture is present, it has the potential to become a low-cost, precise, and highly available indoor positioning technology. (A minimal code sketch of this positioning pipeline follows the metadata listing below.) | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Remote sensing, 2020, v. 12, no. 5, 869 | en_US
dcterms.isPartOf | Remote sensing | en_US
dcterms.issued | 2020 | -
dc.identifier.isi | WOS:000531559300124 | -
dc.identifier.scopus | 2-s2.0-85081904167 | -
dc.identifier.eissn | 2072-4292 | en_US
dc.identifier.artn | 869 | en_US
dc.description.validate | 202007 bcma | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.pubStatus | Published | en_US
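The abstract above describes a pipeline that matches features of a single smartphone image against a pre-built database of image features with known 3D coordinates and then recovers the camera pose with EPnP and bundle adjustment. The sketch below is a minimal illustration of that kind of pipeline under stated assumptions, not the authors' implementation: it assumes OpenCV, uses ORB as a stand-in for SURF (which requires the non-free opencv-contrib build), replaces the paper's Hough-transform-voting mismatch elimination with a ratio test plus RANSAC, uses Levenberg-Marquardt pose refinement in place of full bundle adjustment, and assumes the camera intrinsics and the feature database are already available rather than estimated or built from sequence images as in the paper. All function and variable names (locate_smartphone, db_descriptors, db_points_3d) are illustrative only.

# Minimal sketch (assumptions, not the authors' code): match a query image against a
# pre-built database of (descriptor, 3D point) pairs, then estimate the camera pose
# with EPnP and refine it.
import numpy as np
import cv2

def locate_smartphone(query_gray, db_descriptors, db_points_3d, K, dist_coeffs):
    """Estimate the smartphone camera position from one grayscale query image.

    db_descriptors : (N, 32) uint8 ORB descriptors of the database feature points
    db_points_3d   : (N, 3) float32 object-space coordinates of those points
    K, dist_coeffs : camera intrinsic matrix and lens distortion coefficients
    """
    # 1. Detect and describe features in the positioning image (ORB stands in for SURF).
    detector = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = detector.detectAndCompute(query_gray, None)

    # 2. Match query descriptors to database descriptors -> 2D-3D correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(descriptors, db_descriptors, k=2)

    # 3. Lowe ratio test to discard ambiguous matches (the paper instead removes
    #    mismatches with Hough transform voting).
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    img_pts = np.float32([keypoints[m.queryIdx].pt for m in good])
    obj_pts = np.float32([db_points_3d[m.trainIdx] for m in good])

    # 4. EPnP inside RANSAC, then iterative refinement of the pose on the inliers.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K, dist_coeffs, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("Pose estimation failed: too few reliable matches")
    idx = inliers.ravel()
    rvec, tvec = cv2.solvePnPRefineLM(
        obj_pts[idx], img_pts[idx], K, dist_coeffs, rvec, tvec)

    # 5. Camera centre in object space: C = -R^T * t.
    R, _ = cv2.Rodrigues(rvec)
    return (-R.T @ tvec).ravel()

In this sketch the intrinsics K and dist_coeffs would come from a prior smartphone camera calibration, whereas the paper reports estimating the intrinsic and extrinsic parameters of the positioning image jointly.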
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Size | Format
Li_precise_indoor_visual.pdf | 4.76 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Page views: 43 (last week: 0), as of May 12, 2024
Downloads: 25, as of May 12, 2024
Scopus citations: 22, as of May 16, 2024
Web of Science citations: 16, as of May 16, 2024
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.