Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/13984
Title: Face recognition based on local uncorrelated and weighted global uncorrelated discriminant transforms
Authors: Jing, X
Li, S
Zhang, D 
Yang, J
Keywords: Face recognition
Feature extraction
Local uncorrelated discriminant transform
Uncorrelated constraints
Weighted global uncorrelated discriminant transform
Issue Date: 2011
Source: Proceedings - International Conference on Image Processing, ICIP, 2011, p. 3049-3052
Abstract: Feature extraction is one of the most important problems in image recognition tasks. In many applications, such as face recognition, it is desirable to eliminate redundancy among the extracted discriminant features. In this paper, we propose two novel feature extraction approaches for face recognition: the local uncorrelated discriminant transform (LUDT) and the weighted global uncorrelated discriminant transform (WGUDT). LUDT constructs local uncorrelated constraints, while WGUDT constructs weighted global uncorrelated constraints. Each approach then iteratively computes the optimal discriminant vectors that maximize the Fisher criterion under its statistical uncorrelated constraints. The proposed LUDT and WGUDT approaches are evaluated on the public AR and FERET face databases. Experimental results demonstrate that they outperform several representative feature extraction methods.
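The shared skeleton described in the abstract — extracting discriminant vectors one at a time, each maximizing the Fisher criterion while remaining statistically uncorrelated with those already found — can be sketched generically. The snippet below is NOT the paper's LUDT or WGUDT algorithm (the local and weighted global constraint constructions are not reproduced here); it only illustrates the classical uncorrelated-constraint idea (w_i^T S_t w_j = 0 for j < i) that both methods build on. All names are illustrative.

```python
import numpy as np

def uncorrelated_discriminant_vectors(X, y, k):
    """Sequentially extract k discriminant vectors maximizing the Fisher
    criterion w^T Sb w / w^T Sw w, each statistically uncorrelated
    (w_i^T St w_j = 0, j < i) with the previously found ones.
    Generic sketch only -- not the paper's LUDT/WGUDT constructions."""
    d = X.shape[1]
    mean = X.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                    # within-class scatter
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)   # between-class scatter
    St = Sw + Sb                                         # total scatter
    Sw = Sw + 1e-6 * np.trace(Sw) * np.eye(d)            # regularize for invertibility
    W = []
    for _ in range(k):
        if W:
            # Basis of the subspace satisfying the uncorrelated
            # constraints (St @ W_prev)^T w = 0.
            C = St @ np.column_stack(W)
            _, _, Vt = np.linalg.svd(C.T)
            B = Vt[len(W):].T
        else:
            B = np.eye(d)
        # Maximize the Fisher criterion restricted to the constraint subspace.
        Swr, Sbr = B.T @ Sw @ B, B.T @ Sb @ B
        vals, vecs = np.linalg.eig(np.linalg.solve(Swr, Sbr))
        w = B @ np.real(vecs[:, np.argmax(np.real(vals))])
        W.append(w / np.linalg.norm(w))
    return np.column_stack(W)
```

Projecting each new search onto the null space of the accumulated constraints is one standard way to enforce St-uncorrelatedness; the paper's contribution lies in how the local and weighted global constraints themselves are built.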
Description: 2011 18th IEEE International Conference on Image Processing, ICIP 2011, Brussels, 11-14 September 2011
URI: http://hdl.handle.net/10397/13984
ISBN: 9781457713033
ISSN: 1522-4880
DOI: 10.1109/ICIP.2011.6116307
Appears in Collections: Conference Paper


Scopus citations: 8 (as of Sep 24, 2017)

Page view(s): 39 (checked on Sep 25, 2017)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.