Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107163
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | en_US
dc.creator | Lai, SC | en_US
dc.creator | Kong, M | en_US
dc.creator | Lam, KM | en_US
dc.creator | Li, D | en_US
dc.date.accessioned | 2024-06-13T01:04:18Z | -
dc.date.available | 2024-06-13T01:04:18Z | -
dc.identifier.isbn | 978-1-5386-6249-6 (Electronic) | en_US
dc.identifier.isbn | 978-1-5386-6250-2 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/107163 | -
dc.description | 2019 IEEE International Conference on Image Processing (ICIP), 22-25 September 2019, Taipei, Taiwan | en_US
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | ©2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication S.-C. Lai, M. Kong, K.-M. Lam and D. Li, "High-Resolution Face Recognition Via Deep Pore-Feature Matching," 2019 IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan, 2019, pp. 3477-3481 is available at https://doi.org/10.1109/ICIP.2019.8803686. | en_US
dc.subject | Face recognition | en_US
dc.subject | Feature extraction | en_US
dc.subject | High-resolution face recognition | en_US
dc.subject | Pore-scale facial feature | en_US
dc.title | High-resolution face recognition via deep pore-feature matching | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 3477 | en_US
dc.identifier.epage | 3481 | en_US
dc.identifier.doi | 10.1109/ICIP.2019.8803686 | en_US
dcterms.abstract | Because of advances in capturing devices, both image resolution and image quality have been significantly improved. Efficiently utilizing this facial information is beneficial for enhancing the performance of face recognition methods. In high-resolution face images, pore-scale facial features can be observed. The positions and local patterns of pore features are biologically discriminative, so they can be exploited for face identification. In this paper, we extend previous work on pore-scale features by proposing a new learning-based descriptor, namely PoreNet. Experimental results show that our proposed descriptor achieves excellent performance on two high-resolution face datasets, namely Bosphorus and MultiPIE. More importantly, our proposed method significantly outperforms a state-of-the-art Convolutional Neural Network (CNN)-based face recognition method when query faces are highly occluded. The code of our proposed method is available at: https://github.com/johnnysclai/PoreNet. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of 2019 IEEE International Conference on Image Processing (ICIP), 22-25 September 2019, Taipei, Taiwan, p. 3477-3481 | en_US
dcterms.issued | 2019 | -
dc.identifier.scopus | 2-s2.0-85076808677 | -
dc.relation.conference | IEEE International Conference on Image Processing [ICIP] | en_US
dc.description.validate | 202404 bckw | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | EIE-0314 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Hong Kong SAR Government | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 20082323 | -
dc.description.oaCategory | Green (AAM) | en_US
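
The dcterms.abstract field above outlines the approach: detect pore-scale keypoints on a high-resolution face, describe each with a learned descriptor (PoreNet), and match descriptors between a query face and a gallery face. The Python below is a minimal illustrative sketch of the descriptor-and-matching step only; the stand-in network, patch size, and matching rule (mutual nearest neighbours with a Lowe-style ratio test) are all assumptions, not the published PoreNet design. The authors' actual code is at https://github.com/johnnysclai/PoreNet.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyPatchDescriptor(nn.Module):
        """Hypothetical stand-in for a learned pore-patch descriptor.
        NOT the published PoreNet architecture; it only illustrates the
        descriptor-extraction step on cropped pore patches."""
        def __init__(self, dim=128):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                nn.AdaptiveAvgPool2d(1),
            )
            self.proj = nn.Linear(64, dim)

        def forward(self, patches):                  # patches: (N, 1, 32, 32)
            x = self.backbone(patches).flatten(1)    # (N, 64)
            return F.normalize(self.proj(x), dim=1)  # unit-length descriptors

    def match_pore_features(desc_a, desc_b, ratio=0.8):
        """Mutual nearest-neighbour matching with a Lowe-style ratio test.
        desc_a: (Na, D) and desc_b: (Nb, D) L2-normalised descriptors,
        one row per detected pore. Returns a list of (i, j) index pairs."""
        sim = desc_a @ desc_b.t()                    # cosine similarities
        nn_ab = sim.argmax(dim=1)                    # best match a -> b
        nn_ba = sim.argmax(dim=0)                    # best match b -> a
        matches = []
        for i, j in enumerate(nn_ab.tolist()):
            if nn_ba[j].item() != i:                 # keep mutual matches only
                continue
            best2 = torch.topk(sim[i], k=2).values   # two highest similarities
            d1, d2 = 1.0 - best2[0], 1.0 - best2[1]  # cosine distances
            if d1 < ratio * d2:                      # ratio test
                matches.append((i, j))
        return matches

    # Toy usage with random patches: more matched pores would suggest
    # the two faces are more likely the same identity.
    net = TinyPatchDescriptor().eval()
    with torch.no_grad():
        desc_a = net(torch.randn(40, 1, 32, 32))     # pore patches, face A
        desc_b = net(torch.randn(55, 1, 32, 32))     # pore patches, face B
    print(len(match_pore_features(desc_a, desc_b)), "mutual pore matches")

A full pipeline would additionally need a pore detector to crop the patches and a decision rule over the matches (e.g., thresholding the match count or checking their spatial consistency); the sketch stops at descriptor matching.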
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Lai_High-Resolution_Face_Recognition.pdf | Pre-Published version | 964.59 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 1 (as of Jun 30, 2024)
Scopus citations: 6 (as of Jun 21, 2024)
Web of Science citations: 4 (as of Jun 27, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.