Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105696
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Xie, Q | en_US
dc.creator | Zhao, Q | en_US
dc.creator | Meng, D | en_US
dc.creator | Xu, Z | en_US
dc.creator | Gu, S | en_US
dc.creator | Zuo, W | en_US
dc.creator | Zhang, L | en_US
dc.date.accessioned | 2024-04-15T07:35:57Z | -
dc.date.available | 2024-04-15T07:35:57Z | -
dc.identifier.isbn | 978-1-4673-8851-1 (Electronic) | en_US
dc.identifier.isbn | 978-1-4673-8852-8 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/105696 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication Q. Xie et al., "Multispectral Images Denoising by Intrinsic Tensor Sparsity Regularization," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 2016, pp. 1692-1700 is available at https://doi.org/10.1109/CVPR.2016.187. | en_US
dc.title | Multispectral images denoising by intrinsic tensor sparsity regularization | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 1692 | en_US
dc.identifier.epage | 1700 | en_US
dc.identifier.doi | 10.1109/CVPR.2016.187 | en_US
dcterms.abstract | Multispectral images (MSI) deliver a more faithful representation of real scenes than traditional imaging systems and enhance the performance of many computer vision tasks. In practice, however, an MSI is often corrupted by various noises. In this paper, we propose a new tensor-based denoising approach that fully exploits two intrinsic characteristics of an MSI: the global correlation along the spectrum (GCS) and the nonlocal self-similarity across space (NSS). Specifically, we construct a new tensor sparsity measure, called the intrinsic tensor sparsity (ITS) measure, which encodes the sparsity insights delivered by both the Tucker and the CANDECOMP/PARAFAC (CP) low-rank decompositions of a general tensor. We then build a new MSI denoising model by applying the proposed ITS measure to tensors formed from non-local similar patches within the MSI. The intrinsic GCS and NSS knowledge can then be efficiently exploited under the regularization of this tensor sparsity measure to recover an MSI from its corrupted observation. A series of experiments on simulated and real MSI denoising problems shows that our method outperforms state-of-the-art approaches under comprehensive quantitative performance measures. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 26 June - 1 July 2016, Las Vegas, Nevada, p. 1692-1700 | en_US
dcterms.issued | 2016 | -
dc.identifier.scopus | 2-s2.0-84986292332 | -
dc.relation.conference | IEEE Conference on Computer Vision and Pattern Recognition [CVPR] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-1385 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | 973 Program of China; NSFC projects | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 13932511 | -
dc.description.oaCategory | Green (AAM) | en_US
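
Note: the abstract above describes a sparsity measure built from the low-rank structure of a tensor's mode unfoldings. As a rough illustration only, and not the paper's exact ITS formulation, the following Python sketch computes a Tucker-style cue (the mode-unfolding ranks) and a CP-style proxy (their product) for a 3-way patch tensor; the names unfold and its_surrogate are hypothetical helpers introduced here.

    # Hypothetical sketch of combining Tucker- and CP-style low-rank cues,
    # in the spirit of the ITS measure described in the abstract.
    # This is NOT the paper's exact formulation.
    import numpy as np

    def unfold(tensor, mode):
        # Mode-k unfolding: move axis `mode` to the front, flatten the rest.
        return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

    def its_surrogate(tensor, tol=1e-6):
        # Tucker cue: ranks of the mode unfoldings (the Tucker rank vector).
        mode_ranks = [np.linalg.matrix_rank(unfold(tensor, k), tol=tol)
                      for k in range(tensor.ndim)]
        tucker_term = sum(mode_ranks)       # Tucker-style sparsity
        cp_term = int(np.prod(mode_ranks))  # CP-style proxy: product of ranks
        return tucker_term, cp_term

    # Example: an 8x8x8 tensor with Tucker rank (2, 2, 2), standing in for a
    # stack of non-local similar MSI patches that is low-rank along each mode.
    rng = np.random.default_rng(0)
    core = rng.standard_normal((2, 2, 2))
    factors = [rng.standard_normal((8, 2)) for _ in range(3)]
    low_rank = np.einsum('abc,ia,jb,kc->ijk', core, *factors)
    print(its_surrogate(low_rank))  # -> (6, 8): each mode unfolding has rank 2

In the paper's actual model, a measure of this kind regularizes tensors of grouped non-local similar patches; the precise combination of the Tucker and CP terms is defined there, not here.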
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Gu_Multispectral_Images_Denoising.pdf | Pre-Published version | 1.43 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 10 (as of May 12, 2024)
Downloads: 1 (as of May 12, 2024)
Scopus™ citations: 213 (as of May 17, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.