Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107109
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | en_US
dc.creator | Liu, T | en_US
dc.creator | Lam, KM | en_US
dc.date.accessioned | 2024-06-13T01:03:58Z | -
dc.date.available | 2024-06-13T01:03:58Z | -
dc.identifier.isbn | 978-1-7281-8808-9 (Electronic) | en_US
dc.identifier.isbn | 978-1-7281-8809-6 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/107109 | -
dc.description | 2020 25th International Conference on Pattern Recognition (ICPR), 10-15 January 2021, Milan, Italy | en_US
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | ©2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication T. Liu and K.-M. Lam, "Flow-guided Spatial Attention Tracking for Egocentric Activity Recognition," 2020 25th International Conference on Pattern Recognition (ICPR), Milan, Italy, 2021, pp. 4303-4308 is available at https://doi.org/10.1109/ICPR48806.2021.9412512. | en_US
dc.title | Flow-guided spatial attention tracking for egocentric activity recognition | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 4303 | en_US
dc.identifier.epage | 4308 | en_US
dc.identifier.doi | 10.1109/ICPR48806.2021.9412512 | en_US
dcterms.abstract | The popularity of wearable cameras has opened up a new dimension for egocentric activity recognition. While some methods introduce attention mechanisms into deep learning networks to capture fine-grained hand-object interactions, they often neglect to explore the spatio-temporal relationships. Generating spatial attention without adequately exploiting temporal consistency can result in sub-optimal performance on video-based tasks. In this paper, we propose a flow-guided spatial attention tracking (F-SAT) module, based on enhancing motion patterns and inter-frame information, to highlight discriminative features from regions of interest across a video sequence. A new form of input, namely the optical-flow volume, is presented to provide informative cues from moving parts for spatial attention tracking. The proposed F-SAT module is deployed in a two-branch deep architecture, which fuses complementary information for egocentric activity recognition. Experimental results on three egocentric activity benchmarks show that the proposed method achieves state-of-the-art performance. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of 2020 25th International Conference on Pattern Recognition (ICPR), 10-15 January 2021, Milan, Italy, p. 4303-4308 | en_US
dcterms.issued | 2021 | -
dc.identifier.scopus | 2-s2.0-85110443063 | -
dc.relation.conference | International Conference on Pattern Recognition [ICPR] | en_US
dc.description.validate | 202404 bckw | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | EIE-0095 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 55022139 | -
dc.description.oaCategory | Green (AAM) | en_US
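
Note on the abstract above: it describes a flow-guided spatial attention tracking (F-SAT) module that turns a stack of optical-flow fields (the "optical-flow volume") into a spatial attention map used to re-weight appearance features inside a two-branch architecture. The following is a minimal, hypothetical sketch of that general idea, assuming a PyTorch implementation; the class name, tensor shapes, layer sizes, and the residual re-weighting are illustrative assumptions and not the authors' published implementation, for which the paper itself should be consulted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FlowGuidedSpatialAttention(nn.Module):
    # Illustrative sketch (not the authors' code): predicts a per-pixel
    # attention map from a stack of optical-flow fields (the "optical-flow
    # volume") and uses it to re-weight appearance features.
    def __init__(self, flow_channels: int, hidden: int = 64):
        super().__init__()
        # Small convolutional head mapping the flow volume to a 1-channel attention map.
        self.flow_encoder = nn.Sequential(
            nn.Conv2d(flow_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 1, kernel_size=1),
        )

    def forward(self, appearance_feat: torch.Tensor, flow_volume: torch.Tensor) -> torch.Tensor:
        # appearance_feat: (B, C, H, W) features from the RGB branch.
        # flow_volume:     (B, 2*T, h, w) stacked x/y flow fields over T frames.
        attn = self.flow_encoder(flow_volume)                        # (B, 1, h, w)
        attn = F.interpolate(attn, size=appearance_feat.shape[-2:],
                             mode="bilinear", align_corners=False)   # match feature resolution
        attn = torch.sigmoid(attn)                                   # values in (0, 1)
        # Residual re-weighting: keep the original features while emphasising
        # regions highlighted by motion cues.
        return appearance_feat * (1.0 + attn)

# Usage with random tensors (batch of 2; a 5-frame flow stack gives 10 channels).
if __name__ == "__main__":
    module = FlowGuidedSpatialAttention(flow_channels=10)
    rgb_feats = torch.randn(2, 256, 14, 14)
    flows = torch.randn(2, 10, 56, 56)
    out = module(rgb_feats, flows)
    print(out.shape)  # torch.Size([2, 256, 14, 14])

The residual form appearance_feat * (1 + attn) is one common way to apply spatial attention without suppressing unattended regions entirely; it is shown here only as a plausible design choice.
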
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Liu_Flow-Guided_Spatial_Attention.pdf | Pre-Published version | 1.13 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 104 (6 in the last week), as of Nov 9, 2025
Downloads: 97, as of Nov 9, 2025
SCOPUS™ citations: 3, as of Dec 19, 2025
Web of Science™ citations: 2, as of Dec 18, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.