Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/32620
Title: Entropy controlled Laplacian regularization for least square regression
Authors: Wang, X
Tao, D
Li, Z
Keywords: Face recognition
Least squares
Manifold learning
Issue Date: 2010
Publisher: Elsevier
Source: Signal Processing, 2010, v. 90, no. 6, p. 2043-2049
Journal: Signal Processing
Abstract: Least squares regression (LSR) is popular in pattern classification. Compared with other matrix-factorization-based methods, it is simple yet efficient. However, LSR ignores unlabeled samples during training, so the regression error can be large when labeled samples are insufficient. To address this problem, Laplacian regularization can be used to penalize LSR. Extensive theoretical and experimental results have confirmed the validity of Laplacian regularized least squares (LapRLS). However, the regularization introduces multiple hyper-parameters to estimate the intrinsic manifold, so time-consuming cross-validation must be applied to tune them. To alleviate this problem, we assume the intrinsic manifold is a linear combination of a given set of known manifolds. By further assuming equivalent priors over the given manifolds, we introduce an entropy maximization penalty that learns the linear combination coefficients automatically. The entropy maximization trades off smoothness against complexity. The proposed model therefore enjoys the following advantages: (1) it incorporates both labeled and unlabeled data into the training process, (2) it learns the manifold hyper-parameters automatically, and (3) it approximates the true probability distribution with respect to prescribed test data. To test the classification performance of the proposed model, we apply it to three well-known human face datasets, i.e., FERET, ORL, and YALE. Experimental results on these datasets demonstrate the effectiveness and efficiency of the new model compared with traditional LSR and Laplacian regularized least squares.
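Note: The record is metadata-only, but the abstract outlines the optimization clearly enough for a rough illustration. Below is a minimal NumPy sketch, assuming the standard LapRLS objective with the intrinsic Laplacian replaced by a convex combination of candidate graph Laplacians and a maximum-entropy penalty on the combination weights mu. The alternating update scheme, function names, and parameter choices are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def knn_laplacian(X, k=5, sigma=1.0):
    # Graph Laplacian L = D - W of a symmetrized kNN graph with heat-kernel weights.
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]  # k nearest neighbours, excluding self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)  # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W

def entropy_lap_rls(X, y, labeled, laplacians, gamma=1.0, eta=1.0, n_iter=20):
    # Illustrative alternating scheme: a closed-form solve for the regression
    # weights w, then a softmax (maximum-entropy) update of the manifold
    # combination weights mu with equivalent priors.
    n, d = X.shape
    J = np.diag(labeled.astype(float))  # squared loss counts labeled rows only
    mu = np.full(len(laplacians), 1.0 / len(laplacians))  # uniform start
    for _ in range(n_iter):
        L = sum(m * Lk for m, Lk in zip(mu, laplacians))  # combined Laplacian
        A = X.T @ J @ X + gamma * X.T @ L @ X + 1e-6 * np.eye(d)
        w = np.linalg.solve(A, X.T @ J @ y)  # ridge-style closed form
        # Smoothness of the current predictor along each candidate manifold.
        s = np.array([w @ X.T @ Lk @ X @ w for Lk in laplacians])
        mu = np.exp(-(gamma / eta) * (s - s.min()))  # shift for numerical stability
        mu /= mu.sum()  # entropy penalty keeps mu a proper distribution
    return w, mu

Usage under the same assumptions: stack labeled and unlabeled face vectors into X, build several candidate Laplacians (e.g. knn_laplacian(X, k) for a few values of k and sigma), set y to class indicator values on labeled rows and zero elsewhere, and pass a boolean mask as labeled; the returned mu indicates which candidate manifold the data favor.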
URI: http://hdl.handle.net/10397/32620
ISSN: 0165-1684
EISSN: 1872-7557
DOI: 10.1016/j.sigpro.2010.01.006
Appears in Collections: Journal/Magazine Article
