Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115601
DC Field | Value | Language
dc.contributorDepartment of Land Surveying and Geo-Informatics-
dc.contributorResearch Centre for Deep Space Explorations-
dc.creatorMa, Y-
dc.creatorLi, Z-
dc.creatorWu, B-
dc.creatorDuan, R-
dc.date.accessioned2025-10-08T01:16:56Z-
dc.date.available2025-10-08T01:16:56Z-
dc.identifier.urihttp://hdl.handle.net/10397/115601-
dc.language.isoenen_US
dc.publisherAmerican Geophysical Unionen_US
dc.rights© 2025 The Author(s).en_US
dc.rightsThis is an open access article under the terms of the Creative Commons Attribution‐NonCommercial License (http://creativecommons.org/licenses/by-nc/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.en_US
dc.rightsThe following publication Ma, Y., Li, Z., Wu, B., & Duan, R. (2025). DepthFormer: Depth-enhanced transformer network for semantic segmentation of the Martian surface from rover images. Earth and Space Science, 12, e2024EA003812 is available at https://doi.org/10.1029/2024EA003812.en_US
dc.titleDepthFormer: Depth-enhanced transformer network for semantic segmentation of the Martian surface from rover imagesen_US
dc.typeJournal/Magazine Articleen_US
dc.identifier.volume12-
dc.identifier.issue6-
dc.identifier.doi10.1029/2024EA003812-
dcterms.abstractThe Martian surface, with its diverse landforms that reflect the planet's evolution, has attracted increasing scientific interest. While extensive data are needed for interpretation, identifying landform types is crucial. This semantic information reveals underlying features and patterns, offering valuable scientific insights. Advanced deep learning techniques, particularly Transformers, can enhance semantic segmentation and image interpretation, deepening our understanding of Martian surface features. However, currently available neural networks are trained on terrestrial data, rendering their direct use on Martian surface imagery infeasible. Moreover, the Martian surface exhibits poorly textured and homogeneous scenes, making it difficult to segment images into meaningful semantic classes. In this paper, an innovative depth-enhanced Transformer network, DepthFormer, is developed for the semantic segmentation of Martian surface images. The stereo images acquired by the Zhurong rover along its traverse are used for training and testing the DepthFormer network. Unlike conventional deep-learning networks that process only the three color bands (red, green, and blue) of images, DepthFormer incorporates the depth information available from the stereo images as a fourth band in the network to enable more accurate segmentation of various surface features. Experimental evaluations and comparisons using synthesized and actual Mars image data sets reveal that DepthFormer achieves an average accuracy of 98%, superior to that of conventional segmentation methods. The proposed method is the first deep-learning model incorporating depth information for accurate semantic segmentation of the Martian surface, which is of significance for future Mars exploration missions and scientific studies.-
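The fourth-band idea described in the abstract can be sketched as follows. This is a minimal illustration in NumPy of stacking a stereo-derived depth map onto an RGB image as a fourth input channel; the actual DepthFormer architecture, preprocessing, and normalization scheme are not reproduced here, and the min-max normalization is an assumption made for the sketch.

```python
import numpy as np

def make_rgbd_input(rgb, depth):
    """Stack a depth map onto an RGB image as a fourth band.

    rgb:   (H, W, 3) uint8 image
    depth: (H, W) float depth map (e.g., from stereo matching)
    Returns a (4, H, W) float32 array in channel-first layout,
    with each band scaled to [0, 1].
    """
    rgb_f = rgb.astype(np.float32) / 255.0
    d = depth.astype(np.float32)
    d_range = d.max() - d.min()
    # Min-max normalize depth so its scale is comparable to the color bands
    d_norm = (d - d.min()) / d_range if d_range > 0 else np.zeros_like(d)
    rgbd = np.concatenate([rgb_f, d_norm[..., None]], axis=-1)  # (H, W, 4)
    return np.transpose(rgbd, (2, 0, 1))  # channel-first: (4, H, W)

# Example with synthetic data standing in for a rover image pair product
rgb = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
depth = np.random.rand(64, 64) * 10.0
x = make_rgbd_input(rgb, depth)
print(x.shape)  # (4, 64, 64)
```

A segmentation network consuming this input simply needs its first convolution (or patch-embedding layer, for a Transformer) configured for 4 input channels instead of 3.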
dcterms.accessRightsopen accessen_US
dcterms.bibliographicCitationEarth and space science, June 2025, v. 12, no. 6, e2024EA003812-
dcterms.isPartOfEarth and space science-
dcterms.issued2025-06-
dc.identifier.scopus2-s2.0-105008205731-
dc.identifier.eissn2333-5084-
dc.identifier.artne2024EA003812-
dc.description.validate202510 bcch-
dc.description.oaVersion of Recorden_US
dc.identifier.FolderNumberOA_TAen_US
dc.description.fundingSourceRGCen_US
dc.description.fundingSourceOthersen_US
dc.description.fundingTextThis work was supported by grants from the Research Grants Council of Hong Kong (Project PolyU 15210520, Project PolyU 15215822, Project PolyU 15236524, RIF Project R5043-19, CRF Project C7004-21GF). The authors would like to thank all those who worked on the archive of the data sets to make them publicly available.en_US
dc.description.pubStatusPublisheden_US
dc.description.TAWiley (2025)en_US
dc.description.oaCategoryTAen_US
Appears in Collections:Journal/Magazine Article
Files in This Item:
File: Ma_DepthFormer_Depth_Enhanced.pdf | Size: 3.92 MB | Format: Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.