Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/65010
DC Field / Value / Language
dc.contributorDepartment of Land Surveying and Geo-Informatics-
dc.creatorHu, H-
dc.creatorChen, C-
dc.creatorWu, B-
dc.creatorYang, X-
dc.creatorZhu, Q-
dc.creatorDing, Y-
dc.date.accessioned2017-04-11T01:15:51Z-
dc.date.available2017-04-11T01:15:51Z-
dc.identifier.issn2194-9042 (print)en_US
dc.identifier.urihttp://hdl.handle.net/10397/65010-
dc.descriptionXXIII ISPRS Congress, 12-19 July 2016, Prague, Czech Republicen_US
dc.language.isoenen_US
dc.publisherCopernicus Publicationsen_US
dc.rights© Author(s) 2016. This is an open access article distributed under the Creative Commons Attribution 3.0 License (https://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.en_US
dc.rightsThe following publication: Hu, H., Chen, C., Wu, B., Yang, X., Zhu, Q., and Ding, Y.: TEXTURE-AWARE DENSE IMAGE MATCHING USING TERNARY CENSUS TRANSFORM, ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-3, 59-66, 2016, is available at https://doi.org/10.5194/isprs-annals-III-3-59-2016.en_US
dc.subjectDense image matchingen_US
dc.subjectTexture awareen_US
dc.subjectCensus transformen_US
dc.subjectLocal ternary patternen_US
dc.subjectSGMen_US
dc.subjectMatching costen_US
dc.titleTexture-aware dense image matching using ternary census transformen_US
dc.typeConference Paperen_US
dc.identifier.spage59en_US
dc.identifier.epage66en_US
dc.identifier.volumeIII-3en_US
dc.identifier.doi10.5194/isprs-annals-III-3-59-2016en_US
dcterms.abstractTextureless regions and geometric discontinuities are major problems in state-of-the-art dense image matching methods, as they can cause visually significant noise and the loss of sharp features. The binary census transform is one of the best matching cost methods, but in textureless areas, where intensity values are similar, it is sensitive to small random noise. Global optimization for disparity computation is inherently sensitive to parameter tuning in complex urban scenes, and must compromise between smoothness and discontinuities. The aim of this study is to overcome these issues in dense image matching by extending the industry-proven Semi-Global Matching through 1) a ternary census transform, which yields three possible outcomes per intensity comparison and encodes each result in two bits rather than one, and 2) the use of texture information to self-tune the parameters, which both preserves sharp edges and enforces smoothness where necessary. Experimental results on various datasets from different platforms show that the visual quality of the triangulated point clouds in urban areas can be largely improved by the proposed methods.-
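The abstract describes a ternary census transform: each neighbour comparison yields one of three outcomes, encoded in two bits, and matching costs are computed on the resulting codes. The following is a minimal illustrative sketch of that idea in plain Python; the window radius, the similarity threshold `eps`, the bit encoding, and the Hamming-distance cost are assumptions for illustration, not details taken from the paper.

```python
def ternary_census(image, radius=1, eps=2):
    """Encode each pixel's neighbourhood as a ternary census code.

    image: 2D list of intensity values.
    Each neighbour comparison gives one of three outcomes, stored in
    two bits: 0b00 (darker), 0b01 (similar), 0b10 (brighter).
    The 'similar' band makes the code tolerant to small random noise
    in textureless areas, which is the motivation stated in the abstract.
    Window radius, eps, and the encoding are illustrative assumptions.
    """
    h, w = len(image), len(image[0])
    codes = [[0] * w for _ in range(h)]
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            center = image[y][x]
            code = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    if dy == 0 and dx == 0:
                        continue  # skip the center pixel itself
                    v = image[y + dy][x + dx]
                    if v > center + eps:
                        t = 0b10  # clearly brighter than center
                    elif v < center - eps:
                        t = 0b00  # clearly darker than center
                    else:
                        t = 0b01  # within the noise band: "similar"
                    code = (code << 2) | t
            codes[y][x] = code
    return codes


def hamming_cost(a, b):
    """Matching cost between two census codes: Hamming distance."""
    return bin(a ^ b).count("1")
```

On a perfectly flat 3x3 patch every neighbour falls in the "similar" band, so the center code is eight `0b01` pairs; a single brighter neighbour flips one pair to `0b10`, changing the Hamming cost by two bits rather than one, which is the two-bit-per-comparison encoding the abstract refers to.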
dcterms.accessRightsopen accessen_US
dcterms.bibliographicCitationISPRS annals of the photogrammetry, remote sensing and spatial information sciences, 2016, v. III-3, p. 59-66-
dcterms.isPartOfISPRS annals of the photogrammetry, remote sensing and spatial information sciences-
dcterms.issued2016-
dc.identifier.isiWOS:000391012700008-
dc.relation.conferenceISPRS Congressen_US
dc.identifier.rosgroupid2015003393-
dc.description.ros2015-2016 > Academic research: refereed > Publication in refereed journalen_US
dc.description.validate201811_a bcmaen_US
dc.description.oaVersion of Recorden_US
dc.identifier.FolderNumberOA_IR/PIRAen_US
dc.description.pubStatusPublisheden_US
Appears in Collections:Conference Paper
Files in This Item:
File: Hu_Texture-aware_dense_image.pdf (1.2 MB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record
Page views: 254 (7 in the last week), as of Apr 14, 2024
Downloads: 150, as of Apr 14, 2024
Scopus citations: 26 (0 in the last week), as of Apr 19, 2024
Web of Science citations: 22 (0 in the last week), as of Apr 18, 2024
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.