Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/106913
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | en_US
dc.creator | Hu, S | en_US
dc.creator | Jian, M | en_US
dc.creator | Wang, G | en_US
dc.creator | Wang, Y | en_US
dc.creator | Pan, Z | en_US
dc.creator | Lam, KM | en_US
dc.date.accessioned | 2024-06-07T00:58:51Z | -
dc.date.available | 2024-06-07T00:58:51Z | -
dc.identifier.isbn | 978-1-5106-3835-8 | en_US
dc.identifier.isbn | 978-1-5106-3836-5 (electronic) | en_US
dc.identifier.issn | 0277-786X | en_US
dc.identifier.uri | http://hdl.handle.net/10397/106913 | -
dc.description | International Workshop on Advanced Imaging Technology (IWAIT) 2020, 5-7 January 2020, Yogyakarta, Indonesia | en_US
dc.language.iso | en | en_US
dc.publisher | SPIE - International Society for Optical Engineering | en_US
dc.rights | © (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this publication for a fee or for commercial purposes, and modification of the contents of the publication are prohibited. | en_US
dc.rights | The following publication Shiyu Hu, Muwei Jian, Guodong Wang, Yanjie Wang, Zhenkuan Pan, and Kin-Man Lam "Deep skip connection and multi-deconvolution network for single image super-resolution", Proc. SPIE 11515, International Workshop on Advanced Imaging Technology (IWAIT) 2020 is available at https://doi.org/10.1117/12.2567030. | en_US
dc.subject | Convolutional neural network | en_US
dc.subject | Deep skip connection | en_US
dc.subject | Multi-deconvolution layers | en_US
dc.subject | Peak signal-to-noise ratio | en_US
dc.subject | Super-resolution | en_US
dc.title | Deep skip connection and multi-deconvolution network for single image super-resolution | en_US
dc.type | Conference Paper | en_US
dc.identifier.volume | 11515 | en_US
dc.identifier.doi | 10.1117/12.2567030 | en_US
dcterms.abstract | In this paper, we propose an efficient single-image super-resolution (SR) method for multi-scale image texture recovery, based on a deep skip connection and multi-deconvolution network. The method focuses on enhancing the expressive capability of the convolutional neural network, so as to significantly improve the accuracy of the reconstructed high-resolution texture details. The deep skip connection (DSC) makes full use of low-level information together with rich deep features. The multi-deconvolution layers (MDL) reduce the feature dimension, lowering the computational cost incurred by deepening the network. Together, these components reconstruct high-quality SR images. Experimental results show that our proposed method achieves state-of-the-art performance. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Proceedings of SPIE : the International Society for Optical Engineering, 2020, v. 11515, 115152X | en_US
dcterms.isPartOf | Proceedings of SPIE : the International Society for Optical Engineering | en_US
dcterms.issued | 2020 | -
dc.identifier.scopus | 2-s2.0-85086632972 | -
dc.relation.conference | International Workshop on Advanced Imaging Technology [IWAIT] | en_US
dc.identifier.eissn | 1996-756X | en_US
dc.identifier.artn | 115152X | en_US
dc.description.validate | 202405 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | EIE-0245 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 26683665 | -
dc.description.oaCategory | Green (AAM) | en_US
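The abstract describes two building blocks: deconvolution (transposed-convolution) layers that upsample feature maps, and skip connections that fuse low-level features with deep features. The following is a minimal numpy sketch of those two ideas for illustration only; it is not the authors' implementation, and the single-channel transposed convolution and elementwise fusion are simplifying assumptions.

```python
import numpy as np

def deconv2d(x, kernel, stride=2):
    """Minimal single-channel 2-D transposed convolution (no padding):
    each input pixel scatters a scaled copy of the kernel into the
    output, which is how deconvolution layers upsample feature maps."""
    h, w = x.shape
    kh, kw = kernel.shape
    out = np.zeros((stride * (h - 1) + kh, stride * (w - 1) + kw))
    for i in range(h):
        for j in range(w):
            out[i * stride:i * stride + kh,
                j * stride:j * stride + kw] += x[i, j] * kernel
    return out

# A 4x4 low-resolution feature map upsampled by stride 2 to 8x8.
lr = np.arange(16, dtype=float).reshape(4, 4)
k = np.ones((2, 2)) / 4.0        # toy averaging kernel (assumption)
hr = deconv2d(lr, k, stride=2)   # shape: (2*(4-1)+2, 2*(4-1)+2) = (8, 8)

# A skip connection fuses shallow (low-level) features with the deep
# output; modelled here as a simple elementwise addition.
shallow = np.zeros_like(hr)      # placeholder low-level feature map
fused = hr + shallow
print(fused.shape)               # (8, 8)
```

In the paper's network the same principles operate on multi-channel feature maps inside a trained CNN; this sketch only shows the shape arithmetic of upsampling and the additive fusion pattern.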
Appears in Collections:Conference Paper
Files in This Item:
File | Description | Size | Format
Lam_Deep_Skip_Connection.pdf | Pre-Published version | 1.46 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.