Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/118681
DC Field | Value | Language
dc.contributorDepartment of Aeronautical and Aviation Engineering-
dc.creatorZhang, Z-
dc.creatorFang, L-
dc.creatorYan, Z-
dc.creatorChen, T-
dc.creatorWang, B-
dc.creatorWen, CY-
dc.date.accessioned2026-05-11T02:49:49Z-
dc.date.available2026-05-11T02:49:49Z-
dc.identifier.issn1083-4435-
dc.identifier.urihttp://hdl.handle.net/10397/118681-
dc.language.isoenen_US
dc.publisherInstitute of Electrical and Electronics Engineersen_US
dc.rights© 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.en_US
dc.rightsThe following publication Z. Zhang, L. Fang, Z. Yan, T. Chen, B. Wang and C. -y. Wen, 'Spatial–Temporal Diffusion Model for Underwater Scene Reconstruction With Application to AUV Navigation,' in IEEE/ASME Transactions on Mechatronics, vol. 30, no. 6, pp. 4142-4153, Dec. 2025 is available at https://doi.org/10.1109/TMECH.2025.3600436.en_US
dc.subjectAutonomous underwater vehicle (AUV)en_US
dc.subjectDiffusion modelen_US
dc.subjectScene reconstructionen_US
dc.subjectSubsea terrain perceptionen_US
dc.titleSpatial–temporal diffusion model for underwater scene reconstruction with application to AUV navigationen_US
dc.typeJournal/Magazine Articleen_US
dc.identifier.spage4142-
dc.identifier.epage4153-
dc.identifier.volume30-
dc.identifier.issue6-
dc.identifier.doi10.1109/TMECH.2025.3600436-
dcterms.abstractAutonomous underwater vehicles (AUVs) have been extensively utilized in subsea exploration and surveying. However, accurately perceiving the surrounding environment remains a significant challenge for AUVs due to the complexities of subsea terrains. To address this issue, we propose a novel generative scene reconstruction method to enhance AUVs’ perception capabilities. Our method is primarily designed for reconstructing dense subsea terrain from 3-D multibeam echosounder data. We leverage local diffusion and denoising strategies to reconstruct complete subsea terrain directly at the scene scale, without requiring normalization of the point clouds. Considering the motion dynamics of AUVs and the overlap between consecutive sonar frames, we introduce a spatial–temporal attention mechanism that aggregates features from consecutive point clouds and uses them as a condition to guide the reconstruction process. The reconstructed point cloud is then used for probabilistic terrain modeling through Bayesian updating, enabling path planning. Experiments conducted on simulation and real-world datasets demonstrate that our method generates more accurate and complete terrain maps. Furthermore, path planning based on our reconstruction achieves the shortest and smoothest motion path, further validating that our method provides more complete perception information for AUV navigation.-
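The abstract mentions fusing reconstructed point clouds into a probabilistic terrain model via Bayesian updating. A minimal sketch of that step, assuming a standard log-odds occupancy-grid formulation (the grid size, sensor probabilities, and cell indices below are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Log-odds increments for an illustrative inverse sensor model:
# a "hit" (point observed in a cell) raises belief, a "miss" lowers it.
L_HIT = np.log(0.7 / 0.3)
L_MISS = np.log(0.4 / 0.6)

def update_grid(log_odds, hit_cells, miss_cells):
    """Fuse one frame of reconstructed points into the terrain grid."""
    grid = log_odds.copy()
    for r, c in hit_cells:
        grid[r, c] += L_HIT
    for r, c in miss_cells:
        grid[r, c] += L_MISS
    return grid

def to_probability(log_odds):
    """Convert log-odds belief back to occupancy probability."""
    return 1.0 / (1.0 + np.exp(-log_odds))

# Uniform prior (p = 0.5 everywhere), then two consistent observations.
grid = np.zeros((4, 4))
grid = update_grid(grid, hit_cells=[(1, 2)], miss_cells=[(0, 0)])
grid = update_grid(grid, hit_cells=[(1, 2)], miss_cells=[(0, 0)])
p = to_probability(grid)
```

The log-odds form makes the Bayesian update a simple addition per cell, so repeated sonar frames accumulate evidence incrementally; unobserved cells stay at the prior.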
dcterms.accessRightsopen accessen_US
dcterms.bibliographicCitationIEEE/ASME transactions on mechatronics, Dec. 2025, v. 30, no. 6, p. 4142-4153-
dcterms.isPartOfIEEE/ASME transactions on mechatronics-
dcterms.issued2025-12-
dc.identifier.scopus2-s2.0-105017084815-
dc.identifier.eissn1941-014X-
dc.description.validate202605 bcjz-
dc.description.oaAccepted Manuscripten_US
dc.identifier.SubFormIDG001605/2026-03en_US
dc.description.fundingSourceRGCen_US
dc.description.fundingSourceOthersen_US
dc.description.fundingTextThis work was supported in part by the Young Scientists Fund of the National Natural Science Foundation of China under Grant 42301520, in part by the Major Research Project on Scientific Instrument Development of National Natural Science Foundation of China under Grant 42327901, in part by the Research Grants Council of Hong Kong under Grant 25206524, in part by the Innovation and Technology Fund under Grant PRP/068/23FX, in part by the Platform Project of Unmanned Autonomous Systems Research Centre under Grant P0049516, in part by the Guangdong-Hong Kong Joint Laboratory for Marine Infrastructure under Grant 2025B1212150001, and in part by the Seed Projects of Smart Cities Research Institute under Grant P0051028 and Grant P0054511.en_US
dc.description.pubStatusPublisheden_US
dc.description.oaCategoryGreen (AAM)en_US
dc.relation.rdatahttps://github.com/sam-zyzhang/SonarPC-Diff-
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Zhang_Spatial_Temporal_Diffusion.pdf | Pre-Published version | 5.59 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.