Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/19510
Title: Relaxed collaborative representation for pattern classification
Authors: Yang, M; Zhang, L; Zhang, D; Wang, S
Issue Date: 2012
Source: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2012, p. 2224-2231
Abstract: Regularized linear representation learning has led to interesting results in image classification, but how an object should be represented remains a critical issue. Considering that the different features in a sample should contribute differently to pattern representation and classification, in this paper we present a novel relaxed collaborative representation (RCR) model to effectively exploit the similarity and distinctiveness of features. In RCR, each feature vector is coded on its associated dictionary to allow flexibility in feature coding, while the variance of the coding vectors is minimized to account for the similarity among features. In addition, the distinctiveness of the different features is exploited by weighting each feature's distance to the others in the coding domain. The proposed RCR is simple, yet our extensive experimental results on benchmark image databases (e.g., various face and flower databases) show that it is very competitive with state-of-the-art image classification methods.
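For readers who want a concrete picture of the coding step the abstract describes, the following is a minimal sketch, not the authors' released implementation. It assumes K feature vectors y_k, each with its own dictionary D_k, and alternates between solving a ridge-style problem for each coding vector a_k and pulling all codes toward their weighted mean a_bar, which plays the role of the "relaxed collaboration" term. The function name rcr_code, the fixed feature weights, and the hyper-parameters lam and tau are illustrative assumptions; the paper's full algorithm also updates the feature weights, which is omitted here.

```python
import numpy as np

def rcr_code(features, dicts, lam=0.01, tau=0.1, weights=None, n_iter=10):
    """Illustrative relaxed collaborative coding (a sketch, not the authors' code).

    features : list of K feature vectors y_k, shapes (d_k,)
    dicts    : list of K dictionaries D_k, shapes (d_k, n)
    Approximately solves, for fixed feature weights w_k,
        min_{a_1..a_K}  sum_k ||y_k - D_k a_k||^2 + lam ||a_k||^2
                               + tau * w_k * ||a_k - a_bar||^2
    by alternating between the per-feature codes a_k and their mean a_bar.
    """
    K = len(features)
    n = dicts[0].shape[1]
    weights = np.ones(K) if weights is None else np.asarray(weights, dtype=float)
    codes = [np.zeros(n) for _ in range(K)]
    a_bar = np.zeros(n)

    for _ in range(n_iter):
        for k in range(K):
            D, y, w = dicts[k], features[k], weights[k]
            # Ridge-style closed form for a_k with a_bar held fixed:
            # (D^T D + (lam + tau*w) I) a_k = D^T y + tau*w*a_bar
            A = D.T @ D + (lam + tau * w) * np.eye(n)
            codes[k] = np.linalg.solve(A, D.T @ y + tau * w * a_bar)
        # The a_bar minimizing sum_k w_k ||a_k - a_bar||^2 is the weighted mean
        a_bar = np.average(np.stack(codes), axis=0, weights=weights)

    # Weighted reconstruction residual, usable as a classification score
    residual = sum(w * np.sum((y - D @ a) ** 2)
                   for w, y, D, a in zip(weights, features, dicts, codes))
    return codes, a_bar, residual
```

Classification would then proceed as in other representation-based classifiers: code the query sample over each class's sub-dictionaries and assign it to the class with the smallest total weighted reconstruction residual.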
Description: 2012 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2012, Providence, RI, 16-21 June 2012
URI: http://hdl.handle.net/10397/19510
ISBN: 9781467312264
ISSN: 1063-6919
DOI: 10.1109/CVPR.2012.6247931
Appears in Collections: Conference Paper

SCOPUS™ Citations: 122 (last week: 0; last month: 4), as of Sep 23, 2017
Web of Science™ Citations: 75 (last week: 0; last month: 3), as of Sep 22, 2017
Page view(s): 46 (last week: 1), checked on Sep 18, 2017

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.