Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/79621
Title: Regularized label relaxation linear regression
Authors: Fang, XZ
Xu, Y
Li, XL
Lai, ZH
Wong, WK 
Fang, BW
Keywords: Class compactness graph
Computer vision
Label relaxation
Linear regression (LR)
Manifold learning
Issue Date: 2018
Publisher: Institute of Electrical and Electronics Engineers
Source: IEEE transactions on neural networks and learning systems, Apr. 2018, v. 29, no. 4, p. 1006-1018
Journal: IEEE transactions on neural networks and learning systems 
Abstract: Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that, during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which leaves too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs a class compactness graph based on manifold learning and uses it as a regularization term to avoid overfitting. The class compactness graph ensures that samples sharing the same label remain close after they are transformed. Two algorithms, based respectively on the ℓ2-norm and ℓ2,1-norm loss functions, are devised. Both algorithms have compact closed-form solutions in each iteration, so they are easy to implement. Extensive experiments show that these two algorithms outperform state-of-the-art algorithms in terms of classification accuracy and running time.
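To make the alternating scheme described in the abstract concrete, the following is a minimal NumPy sketch of how the ℓ2-norm variant could look. The objective, the unit-weight class compactness graph, and the hyperparameters (lam, beta, n_iter) are illustrative assumptions based on the abstract, not the paper's exact formulation; the closed-form updates of the projection matrix W and the nonnegative relaxation matrix M follow the usual label-relaxation (epsilon-dragging) style.

import numpy as np

def class_compactness_laplacian(labels):
    # Laplacian of a simple class-compactness graph: samples sharing a
    # label are connected with weight 1 (an assumption; the paper may
    # use a different weighting scheme).
    S = (labels[:, None] == labels[None, :]).astype(float)
    np.fill_diagonal(S, 0.0)
    D = np.diag(S.sum(axis=1))
    return D - S

def rlr_lr_fit(X, labels, lam=0.1, beta=0.1, n_iter=20):
    # Sketch of the l2-norm regularized label-relaxation LR variant:
    # alternate closed-form updates of W and the nonnegative relaxation
    # matrix M. Hyperparameters and iteration count are illustrative.
    labels = np.asarray(labels)
    n, d = X.shape
    classes = np.unique(labels)
    Y = (labels[:, None] == classes[None, :]).astype(float)   # strict binary label matrix
    B = np.where(Y > 0, 1.0, -1.0)                             # dragging directions
    M = np.zeros_like(Y)                                       # nonnegative relaxation matrix
    L = class_compactness_laplacian(labels)
    A = X.T @ X + lam * X.T @ L @ X + beta * np.eye(d)         # fixed system matrix
    for _ in range(n_iter):
        T = Y + B * M                                          # relaxed (slack) label matrix
        W = np.linalg.solve(A, X.T @ T)                        # closed-form update of W
        M = np.maximum(B * (X @ W - Y), 0.0)                   # closed-form update of M
    return W, classes

def rlr_lr_predict(X, W, classes):
    # Assign each sample to the class with the largest regression response.
    return classes[np.argmax(X @ W, axis=1)]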
URI: http://hdl.handle.net/10397/79621
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2017.2648880
Appears in Collections:Journal/Magazine Article

Scopus citations: 3 (as of Jan 11, 2019)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.