Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/109922
DC Field | Value | Language
dc.contributor | Department of Health Technology and Informatics | -
dc.creator | Ko, ZYG | -
dc.creator | Li, Y | -
dc.creator | Liu, J | -
dc.creator | Ji, H | -
dc.creator | Qiu, A | -
dc.creator | Chen, N | -
dc.date.accessioned | 2024-11-20T07:30:22Z | -
dc.date.available | 2024-11-20T07:30:22Z | -
dc.identifier.uri | http://hdl.handle.net/10397/109922 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier BV | en_US
dc.rights | © 2024 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Ko, Z. Y. G., Li, Y., Liu, J., Ji, H., Qiu, A., & Chen, N. (2024). DOTnet 2.0: Deep learning network for diffuse optical tomography image reconstruction. Intelligence-Based Medicine, 9, 100133 is available at https://doi.org/10.1016/j.ibmed.2023.100133. | en_US
dc.subject | Breast cancer | en_US
dc.subject | Convolutional neural network | en_US
dc.subject | Deep learning | en_US
dc.subject | Diffuse optical tomography | en_US
dc.subject | Image reconstruction | en_US
dc.title | DOTnet 2.0: Deep learning network for diffuse optical tomography image reconstruction | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 9 | -
dc.identifier.doi | 10.1016/j.ibmed.2023.100133 | -
dcterms.abstract | Breast cancer is the most common cancer worldwide. The standard imaging modality for breast cancer screening is X-ray mammography, which suffers from low sensitivity in women with dense breasts and, despite its low radiation dose, can potentially induce cancer. Diffuse Optical Tomography (DOT) is a noninvasive imaging technique that can potentially be employed to improve early detection of breast cancer. However, conventional model-based algorithms for reconstructing DOT images usually produce low-quality images with limited resolution and low reconstruction accuracy. We propose to integrate deep neural networks (DNNs) with conventional DOT reconstruction methods; this hybrid framework significantly enhances image quality. The DNNs have been trained and tested on sample data derived from clinically relevant breast models. The dataset contains blood vessel structures derived from breast models as well as vessels artificially generated with the Lindenmayer-system algorithm. Comparing the hybrid reconstruction with the ground-truth image, we demonstrated a multi-scale structural similarity index measure (MS-SSIM) score of 0.80–0.90, whereas conventional reconstruction yielded a much lower MS-SSIM score of 0.36–0.59. Both qualitative and quantitative assessments of the reconstructed images indicate that the hybrid approach is superior to conventional methods. This improvement suggests that DOT can potentially become a viable modality for breast cancer screening, providing a step towards the next-generation device for optical mammography. | -
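The abstract notes that part of the training data consists of vessel structures generated with a Lindenmayer-system (L-system) algorithm. As a rough illustration of that general technique only, the Python sketch below expands a simple L-system and interprets the result as branching 2D segments; the axiom, production rule, branch angle, and step size are hypothetical placeholders, not the parameters used in DOTnet 2.0.

```python
# Minimal L-system sketch for a branching, vessel-like 2D structure.
# All rules and constants here are hypothetical examples.
import math

RULES = {"F": "F[+F]F[-F]F"}   # hypothetical production rule
AXIOM = "F"
ANGLE = math.radians(25.0)     # hypothetical branching angle
STEP = 1.0                     # hypothetical segment length

def expand(axiom: str, rules: dict, iterations: int) -> str:
    """Apply the rewriting rules to the axiom a fixed number of times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def interpret(commands: str):
    """Turtle-style interpretation: turn the expanded string into 2D line segments."""
    x, y, heading = 0.0, 0.0, math.pi / 2   # start at the origin, pointing "up"
    stack, segments = [], []
    for ch in commands:
        if ch == "F":                        # move forward, drawing a segment
            nx = x + STEP * math.cos(heading)
            ny = y + STEP * math.sin(heading)
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif ch == "+":
            heading += ANGLE                 # turn left
        elif ch == "-":
            heading -= ANGLE                 # turn right
        elif ch == "[":
            stack.append((x, y, heading))    # remember the branch point
        elif ch == "]":
            x, y, heading = stack.pop()      # return to the branch point
    return segments

if __name__ == "__main__":
    vessel = interpret(expand(AXIOM, RULES, iterations=4))
    print(f"Generated {len(vessel)} vessel-like segments")
```

In practice, such generated structures would be rasterized into optical-property maps to serve as ground-truth phantoms; the paper's actual data-generation pipeline is not reproduced here.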
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Intelligence-based medicine, 2024, 9, 100133 | -
dcterms.isPartOf | Intelligence-based medicine | -
dcterms.issued | 2024 | -
dc.identifier.scopus | 2-s2.0-85185392166 | -
dc.identifier.eissn | 2666-5212 | -
dc.identifier.artn | 100133 | -
dc.description.validate | 202411 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Singapore Ministry of Education (MOE) Academic Research Grants; Science and Technology Project of Jiangsu Province Grant | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
1-s2.0-S2666521223000479-main.pdf | | 6.1 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 2 (as of Nov 24, 2024)
Downloads: 3 (as of Nov 24, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.