Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/79034
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Huang, MX | en_US
dc.creator | Li, JJ | en_US
dc.creator | Ngai, G | en_US
dc.creator | Leong, HV | en_US
dc.creator | Hua, KA | en_US
dc.date.accessioned | 2018-10-26T01:22:11Z | -
dc.date.available | 2018-10-26T01:22:11Z | -
dc.identifier.issn | 1520-9210 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/79034 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication M. X. Huang, J. Li, G. Ngai, H. V. Leong and K. A. Hua, "Fast-PADMA: Rapidly Adapting Facial Affect Model From Similar Individuals," in IEEE Transactions on Multimedia, vol. 20, no. 7, pp. 1901-1915, July 2018 is available at https://doi.org/10.1109/TMM.2017.2775206. | en_US
dc.subject | Affective computing | en_US
dc.subject | Facial affect | en_US
dc.subject | Rapid modeling | en_US
dc.subject | User-adaptive model | en_US
dc.title | Fast-PADMA: rapidly adapting facial affect model from similar individuals | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1901 | en_US
dc.identifier.epage | 1915 | en_US
dc.identifier.volume | 20 | en_US
dc.identifier.issue | 7 | en_US
dc.identifier.doi | 10.1109/TMM.2017.2775206 | en_US
dcterms.abstract | A user-specific model generally performs better in facial affect recognition. Existing solutions, however, have usability issues, since annotation can be long and tedious for end users (e.g., consumers). We address this critical issue by presenting a more user-friendly user-adaptive model that makes the personalized approach practical. This paper proposes a novel user-adaptive model, which we call Fast-Personal Affect Detection with Minimal Annotation (Fast-PADMA). Fast-PADMA integrates data from multiple source subjects with a small amount of data from the target subject. Collecting this target subject data is feasible, since Fast-PADMA requires only one self-reported affect annotation per facial video segment. To alleviate overfitting in this context of limited individual training data, we propose an efficient bootstrapping technique that strengthens the contribution of multiple similar source subjects. Specifically, we employ an ensemble method that combines weak generic classifiers pretrained on the data of multiple source subjects, which are weighted according to the available data from the target user (see the sketch following this record). The result is a model that does not require expensive computation, such as distribution dissimilarity calculation or model retraining. We evaluate our method with in-depth experiments on five publicly available facial datasets, with results that compare favorably with state-of-the-art performance on classifying pain, arousal, and valence. Our findings show that Fast-PADMA rapidly constructs a user-adaptive model that outperforms both its generic and user-specific counterparts. This efficient technique has the potential to significantly improve user-adaptive facial affect recognition for personal use and thereby enable comprehensive affect-aware applications. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on multimedia, July 2018, v. 20, no. 7, p. 1901-1915 | en_US
dcterms.isPartOf | IEEE transactions on multimedia | en_US
dcterms.issued | 2018-07 | -
dc.identifier.isi | WOS:000435570100024 | -
dc.identifier.eissn | 1941-0077 | en_US
dc.identifier.rosgroupid | 2017005355 | -
dc.description.ros | 2017-2018 > Academic research: refereed > Publication in refereed journal | en_US
dc.description.validate | 201810 bcrc | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-0891 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 6802391 | -
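
The weighted source-classifier ensemble described in the abstract can be made concrete with a minimal sketch. This is an illustrative reading under assumptions, not the authors' implementation: it assumes scikit-learn-style classifiers exposing a .predict() method, binary labels in {0, 1}, and accuracy on the target user's few annotated samples as the weighting signal; build_weighted_ensemble and all parameter names are hypothetical.

    import numpy as np

    def build_weighted_ensemble(source_classifiers, X_target, y_target):
        """Combine pretrained per-source classifiers, weighting each by its
        accuracy on the target user's small annotated sample.

        Hypothetical sketch; the paper's actual bootstrapping and weighting
        scheme may differ.
        """
        # Score each source classifier on the few target annotations.
        weights = np.array([
            np.mean(clf.predict(X_target) == y_target)
            for clf in source_classifiers
        ])
        # Normalize; fall back to uniform weights if every classifier fails.
        total = weights.sum()
        if total > 0:
            weights = weights / total
        else:
            weights = np.full(len(weights), 1.0 / len(weights))

        def predict(X):
            # Weighted vote: average the 0/1 predictions, then threshold.
            votes = np.stack([clf.predict(X) for clf in source_classifiers])
            return (weights @ votes >= 0.5).astype(int)

        return predict

Because only the weights depend on the target user, adapting to a new user requires no retraining of the source models, which is consistent with the abstract's claim of avoiding distribution-dissimilarity calculation and model retraining.
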
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Li_Fast-Padma_Rapidly_Adapting.pdf | Pre-Published version | 1.63 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.