Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/89300
DC Field | Value | Language
dc.contributor | School of Nursing | en_US
dc.contributor | Department of Computing | en_US
dc.creator | Hang, W | en_US
dc.creator | Liang, S | en_US
dc.creator | Choi, KS | en_US
dc.creator | Chung, FL | en_US
dc.creator | Wang, S | en_US
dc.date.accessioned | 2021-03-08T04:12:48Z | -
dc.date.available | 2021-03-08T04:12:48Z | -
dc.identifier.uri | http://hdl.handle.net/10397/89300 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication W. Hang, S. Liang, K.-S. Choi, F.-L. Chung and S. Wang, "Selective Transfer Classification Learning With Classification-Error-Based Consensus Regularization," in IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 5, no. 2, pp. 178-190, April 2021 is available at https://dx.doi.org/10.1109/TETCI.2019.2892762. | en_US
dc.subject | Classification-error-based consensus regularization (CCR) | en_US
dc.subject | Leave-one-out cross-validation | en_US
dc.subject | Least square-support vector machine (LS-SVM) | en_US
dc.subject | Transfer learning | en_US
dc.title | Selective transfer classification learning with classification-error-based consensus regularization | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 5 | en_US
dc.identifier.issue | 2 | en_US
dc.identifier.doi | 10.1109/TETCI.2019.2892762 | en_US
dcterms.abstract | Transfer learning methods are conventionally conducted by utilizing abundant labeled data in the source domain to build an accurate classifier for the target domain with scarce labeled data. However, most current transfer learning methods assume that all the source data are relevant to the target domain, which may induce a negative learning effect when this assumption becomes invalid, as in many practical scenarios. To tackle this issue, the key is to accurately and quickly select the correlated source data and the corresponding weights. In this paper, we make use of the least square-support vector machine (LS-SVM) framework for identifying the correlated data and their weights from the source domain. By keeping the consistency between the distributions of the classification errors of both the source and target domains, we first propose the classification-error-based consensus regularization (CCR), which can guarantee the performance improvement of the target classifier. Based on this approach, a novel CCR-based selective transfer classification learning method (CSTL) is then developed to autonomously and quickly choose the correlated source data and their weights to exploit the transferred knowledge by solving the LS-SVM based objective function. This method minimizes the leave-one-out cross-validation error despite scarce target training data. The advantages of the CSTL are demonstrated by evaluating its performance on public image and text datasets and comparing it with that of the state-of-the-art transfer learning methods. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on emerging topics in computational intelligence, Apr. 2021, v. 5, no. 2, p. 178-190 | en_US
dcterms.isPartOf | IEEE transactions on emerging topics in computational intelligence | en_US
dcterms.issued | 2021-04 | -
dc.identifier.eissn | 2471-285X | en_US
dc.description.ros | 202103 bcrc | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a0597-n20 | -
dc.identifier.SubFormID | 462 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingText | PolyU 152040/16E | en_US
dc.description.pubStatus | Published | en_US
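Note: the abstract above describes CSTL as building on the weighted LS-SVM classifier. As a rough orientation only, the following minimal NumPy sketch shows how per-instance weights enter a standard weighted LS-SVM dual system, so that down-weighted source instances barely influence the learned classifier. This is an assumption-laden illustration, not the authors' CSTL implementation: it omits the classification-error-based consensus regularization and the leave-one-out criterion, and the helper names (weighted_lssvm_fit, rbf_kernel) and the fixed source_weights values are hypothetical placeholders for the weights that CSTL would actually learn.

import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # RBF kernel matrix between the row vectors of A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def weighted_lssvm_fit(X, y, weights, C=10.0, gamma=0.5):
    # Solve the weighted LS-SVM dual linear system:
    #   [ 0      1^T               ] [ b     ]   [ 0 ]
    #   [ 1   K + diag(1/(C * w))  ] [ alpha ] = [ y ]
    # Shrinking weights[i] toward zero inflates the corresponding
    # diagonal term, so that instance barely constrains the solution.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = K + np.diag(1.0 / (C * np.clip(weights, 1e-6, None)))
    sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, alpha, b, X_test, gamma=0.5):
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha + b)

# Toy usage: abundant source data plus scarce target data, with source
# instances trusted only through (hypothetical) per-instance weights.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(100, 2)) + np.array([1.0, 1.0])
y_src = np.sign(X_src[:, 0] - X_src[:, 1] + 1e-9)
X_tgt = rng.normal(size=(10, 2))
y_tgt = np.sign(X_tgt[:, 0] - X_tgt[:, 1] + 1e-9)

X = np.vstack([X_src, X_tgt])
y = np.concatenate([y_src, y_tgt])
source_weights = np.full(len(X_src), 0.3)   # placeholder: CSTL would learn these
weights = np.concatenate([source_weights, np.ones(len(X_tgt))])

alpha, b = weighted_lssvm_fit(X, y, weights)
print(lssvm_predict(X, alpha, b, X_tgt))

The only point of the sketch is the role of the weights: source instances enter the same dual system as target instances, and their influence is controlled entirely by the per-instance weight on the error term.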
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
a0597-n20_462.pdf | Pre-Published version | 2.36 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 86 (as of Apr 21, 2024)
Downloads: 51 (as of Apr 21, 2024)
Web of Science citations: 4 (as of Apr 25, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.