Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/89300
Title: Selective transfer classification learning with classification-error-based consensus regularization
Authors: Hang, W; Liang, S; Choi, KS; Chung, FL; Wang, S
Issue Date: Apr-2021
Source: IEEE Transactions on Emerging Topics in Computational Intelligence, Apr. 2021, v. 5, no. 2, p. 178-190
Abstract: Transfer learning methods are conventionally conducted by utilizing abundant labeled data in the source domain to build an accurate classifier for the target domain with scarce labeled data. However, most current transfer learning methods assume that all the source data are relevant to the target domain, which may induce a negative learning effect when the assumption becomes invalid, as in many practical scenarios. To tackle this issue, the key is to accurately and quickly select the correlated source data and the corresponding weights. In this paper, we make use of the least squares support vector machine (LS-SVM) framework for identifying the correlated data and their weights from the source domain. By keeping the consistency between the distributions of the classification errors of both the source and target domains, we first propose the classification-error-based consensus regularization (CCR), which can guarantee the performance improvement of the target classifier. Based on this approach, a novel CCR-based selective transfer classification learning method (CSTL) is then developed to autonomously and quickly choose the correlated source data and their weights to exploit the transferred knowledge by solving the LS-SVM-based objective function. This method minimizes the leave-one-out cross-validation error despite scarce target training data. The advantages of the CSTL are demonstrated by evaluating its performance on public image and text datasets and comparing it with that of state-of-the-art transfer learning methods.
Keywords: Classification-error-based consensus regularization (CCR); Leave-one-out cross-validation; Least squares support vector machine (LS-SVM); Transfer learning
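The abstract builds on the least squares support vector machine (LS-SVM). For orientation only, the sketch below shows a standard kernel LS-SVM classifier solved as a single linear system, which is the kind of formulation the paper extends; it is not the CSTL/CCR method itself, and all names (rbf_kernel, lssvm_fit, the toy data) are illustrative assumptions rather than anything taken from the publication.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    sq = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * sq)

def lssvm_fit(X, y, C=1.0, gamma=1.0):
    # LS-SVM trained as kernel ridge regression on +/-1 labels:
    # instead of a QP, solve the (n+1) x (n+1) linear system
    #   [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y].
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha (dual weights), bias b

def lssvm_decision(X_train, alpha, b, X_test, gamma=1.0):
    # Decision values f(x) = sum_i alpha_i k(x, x_i) + b; the sign gives the label.
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b

# Toy usage with synthetic +/-1 labels (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + 0.1 * rng.normal(size=40) > 0, 1.0, -1.0)
alpha, b = lssvm_fit(X, y, C=10.0, gamma=0.5)
acc = np.mean(np.sign(lssvm_decision(X, alpha, b, X)) == y)
print("training accuracy:", acc)
```

The closed-form linear-system solution is what makes LS-SVM attractive as a base learner here: selecting source data and their weights, and evaluating leave-one-out error, can be done without repeatedly solving a quadratic program.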
Publisher: Institute of Electrical and Electronics Engineers
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
EISSN: 2471-285X
DOI: 10.1109/TETCI.2019.2892762
Rights: © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The following publication W. Hang, S. Liang, K.-S. Choi, F.-L. Chung and S. Wang, "Selective Transfer Classification Learning With Classification-Error-Based Consensus Regularization," in IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 5, no. 2, pp. 178-190, April 2021 is available at https://dx.doi.org/10.1109/TETCI.2019.2892762.
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
---|---|---|---
a0597-n20_462.pdf | Pre-published version | 2.36 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.