Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105698
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Wang, Q | en_US
dc.creator | Li, P | en_US
dc.creator | Zuo, W | en_US
dc.creator | Zhang, L | en_US
dc.date.accessioned | 2024-04-15T07:35:58Z | -
dc.date.available | 2024-04-15T07:35:58Z | -
dc.identifier.isbn | 978-1-4673-8851-1 (Electronic) | en_US
dc.identifier.isbn | 978-1-4673-8852-8 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/105698 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication Q. Wang, P. Li, W. Zuo and L. Zhang, "RAID-G: Robust Estimation of Approximate Infinite Dimensional Gaussian with Application to Material Recognition," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 2016, pp. 4433-4441 is available at https://doi.org/10.1109/CVPR.2016.480. | en_US
dc.title | RAID-G: robust estimation of approximate infinite dimensional Gaussian with application to material recognition | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 4433 | en_US
dc.identifier.epage | 4441 | en_US
dc.identifier.doi | 10.1109/CVPR.2016.480 | en_US
dcterms.abstract | Infinite dimensional covariance descriptors can provide richer and more discriminative information than their low dimensional counterparts. In this paper, we propose a novel image descriptor, namely, the robust approximate infinite dimensional Gaussian (RAID-G). The challenges of RAID-G mainly lie in two aspects: (1) describing an infinite dimensional Gaussian is difficult due to its non-linear Riemannian geometric structure and the infinite dimensional setting, so effective approximation is necessary; (2) traditional maximum likelihood estimation (MLE) is not robust for high (even infinite) dimensional covariance matrices in the Gaussian setting. To address these challenges, explicit feature mapping (EFM) is first introduced to effectively approximate the infinite dimensional Gaussian induced by an additive kernel function, and a new regularized MLE method based on the von Neumann divergence is then proposed for robust estimation of the covariance matrix. The EFM and the proposed regularized MLE admit a closed form of RAID-G, which is very efficient and effective for high dimensional features. We extend RAID-G by using the outputs of deep convolutional neural networks as original features, and apply it to material recognition. Our approach is evaluated on five material benchmarks and one fine-grained benchmark. It achieves 84.9% accuracy on FMD and 86.3% accuracy on the UIUC material database, both well above the previous state of the art. (An illustrative sketch of the regularized estimation follows this record.) | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 26 June - 1 July 2016, Las Vegas, Nevada, p. 4433-4441 | en_US
dcterms.issued | 2016 | -
dc.identifier.scopus | 2-s2.0-84986331499 | -
dc.relation.conference | IEEE Conference on Computer Vision and Pattern Recognition [CVPR] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-1388 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 13932293 | -
dc.description.oaCategory | Green (AAM) | en_US
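As context for the abstract above: the paper's estimator regularizes MLE with the von Neumann matrix divergence, D_vN(X, Y) = tr(X log X - X log Y - X + Y). The sketch below is a minimal illustration of the overall pipeline, not the authors' closed-form vN-MLE: the signed square-root map stands in for the additive-kernel EFM, the eigenvalue shrinkage toward the identity stands in for the von Neumann regularizer, and the function names and the alpha parameter are assumptions made for this example.

```python
import numpy as np

# Illustrative sketch only; NOT the paper's closed-form vN-MLE.
# explicit_feature_map, regularized_gaussian, and alpha are assumed names
# and choices for this example.

def explicit_feature_map(X):
    """Stand-in EFM: signed square root (an explicit map related to the
    Hellinger kernel on non-negative features)."""
    return np.sign(X) * np.sqrt(np.abs(X))

def regularized_gaussian(X, alpha=0.1):
    """Estimate (mu, Sigma) from mapped features, shrinking the covariance
    spectrum toward the identity in place of the von Neumann regularizer."""
    Phi = explicit_feature_map(X)           # n x d mapped features
    mu = Phi.mean(axis=0)
    S = np.cov(Phi, rowvar=False)           # d x d sample covariance
    w, U = np.linalg.eigh(S)                # spectrum of S
    w_reg = (1.0 - alpha) * np.maximum(w, 0.0) + alpha  # shrink toward 1
    Sigma = (U * w_reg) @ U.T               # U diag(w_reg) U^T
    return mu, Sigma

# Usage: X would be per-image features, e.g. stacked CNN activations.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))
mu, Sigma = regularized_gaussian(X)
print(mu.shape, Sigma.shape)                # (64,), (64, 64)
```

Like the paper's regularizer, the shrinkage keeps the covariance estimate full-rank and well conditioned when the feature dimension approaches or exceeds the sample count; the actual closed form and kernel maps should be taken from the publication itself.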
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Zhang_Raid-G_Robust_Estimation.pdf | Pre-Published version | 1.86 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 8 (as of May 12, 2024)
Downloads: 1 (as of May 12, 2024)
Scopus citations: 56 (as of May 17, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.