Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/21715
Title: Sparsely encoded local descriptor for face verification
Authors: Cui, Z
Shan, S
Wang, R
Zhang, L 
Chen, X
Keywords: Face verification
Labeled faces in the wild
Local descriptor
Non-negativity
Sparse coding
Issue Date: 2015
Publisher: Elsevier
Source: Neurocomputing, 2015, v. 147, no. 1, p. 403-411
Journal: Neurocomputing 
Abstract: A novel Sparsely Encoded Local Descriptor (SELD) is proposed for face verification. Unlike traditional hard or soft quantization methods, we exploit a linear regression (LR) model with sparsity and non-negativity constraints to extract more discriminative features (i.e. sparse codes) from local image patches sampled pixel-wise. Sum-pooling is then applied to integrate all the sparse codes within each block partitioned from the whole face image. Whitened Principal Component Analysis (WPCA) is finally used to suppress noise and reduce the dimensionality of the pooled features, which results in the so-called SELD. To validate the proposed method, comprehensive experiments are conducted on the face verification task to compare SELD with existing related methods in terms of three variable component modules: K-means or K-SVD for dictionary learning, hard/soft assignment or a regression model for encoding, and sum-pooling or max-pooling for pooling. Experimental results show that our method achieves a competitive accuracy compared with the state-of-the-art methods on the challenging Labeled Faces in the Wild (LFW) database.
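The pipeline described in the abstract (pixel-wise patch extraction, non-negative sparse encoding against a dictionary, block-wise sum-pooling, then whitened PCA) can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the dictionary here is random (standing in for one learned by K-means or K-SVD), the non-negative sparse codes are obtained by a simple projected-gradient solver for an L1-penalised least-squares objective, and all sizes (3x3 patches, 16 atoms, a 2x2 pooling grid, toy 12x12 "images") are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(img, r=1):
    """Pixel-wise (2r+1)x(2r+1) patches from the valid interior of img.
    Returns the flattened patches and each patch's (row, col) centre."""
    H, W = img.shape
    patches, centers = [], []
    for i in range(r, H - r):
        for j in range(r, W - r):
            patches.append(img[i-r:i+r+1, j-r:j+r+1].ravel())
            centers.append((i, j))
    return np.array(patches), centers

def encode_nonneg_sparse(X, D, lam=0.1, n_iter=300):
    """Sparse, non-negative codes via projected gradient descent on
    min_A 0.5*||X - A D||_F^2 + lam*sum(A)  subject to  A >= 0,
    where X is (n_patches, p) and D is a (k, p) dictionary."""
    A = np.zeros((X.shape[0], D.shape[0]))
    L = np.linalg.norm(D @ D.T, 2) + 1e-8   # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = (A @ D - X) @ D.T + lam
        A = np.maximum(A - grad / L, 0.0)   # gradient step + projection onto A >= 0
    return A

def sum_pool(A, centers, img_shape, grid=2):
    """Sum-pool the codes within each cell of a grid x grid block
    partition of the image, then concatenate the cells."""
    H, W = img_shape
    pooled = np.zeros((grid, grid, A.shape[1]))
    for a, (i, j) in zip(A, centers):
        pooled[min(i * grid // H, grid - 1), min(j * grid // W, grid - 1)] += a
    return pooled.ravel()

def wpca(F, dim):
    """Whitened PCA: project centred features onto the top components
    and normalise each by its singular value."""
    Fc = F - F.mean(0)
    U, S, Vt = np.linalg.svd(Fc, full_matrices=False)
    return Fc @ Vt[:dim].T / (S[:dim] + 1e-8)

# Toy demo: a random non-negative dictionary stands in for a learned one.
k, patch_dim = 16, 9
D = np.abs(rng.standard_normal((k, patch_dim)))
D /= np.linalg.norm(D, axis=1, keepdims=True)

descriptors = []
for _ in range(6):                        # six toy "face images"
    img = rng.random((12, 12))
    X, centers = extract_patches(img)
    A = encode_nonneg_sparse(X, D)        # sparse, non-negative codes
    descriptors.append(sum_pool(A, centers, img.shape))
F = np.array(descriptors)
seld = wpca(F, dim=4)                     # final SELD-style descriptors
print(seld.shape)                         # (6, 4)
```

The projected-gradient encoder is only one way to satisfy the sparsity and non-negativity constraints; the paper's LR-based formulation may differ in objective and solver.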
URI: http://hdl.handle.net/10397/21715
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2014.06.044
Appears in Collections: Journal/Magazine Article

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.