Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115752
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | en_US
dc.creator | Su, Y | en_US
dc.creator | Wang, Y | en_US
dc.creator | Chau, LP | en_US
dc.date.accessioned | 2025-10-27T08:44:10Z | -
dc.date.available | 2025-10-27T08:44:10Z | -
dc.identifier.issn | 0957-4174 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/115752 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier Ltd | en_US
dc.subject | Attention mechanism | en_US
dc.subject | Egocentric | en_US
dc.subject | Hand-object interaction | en_US
dc.subject | Relationship modeling | en_US
dc.title | CaRe-Ego: contact-aware relationship modeling for egocentric interactive hand-object segmentation | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 296 | en_US
dc.identifier.doi | 10.1016/j.eswa.2025.129148 | en_US
dcterms.abstract | Egocentric interactive hand-object segmentation (EgoIHOS) requires segmenting hands and interacting objects in egocentric images, which is crucial for understanding human behaviors in assistive systems. Current methods often overlook the essential interactive relationships between hands and objects, or merely establish coarse hand-object associations to recognize targets, leading to suboptimal accuracy. To address this issue, we propose a novel CaRe-Ego method that achieves state-of-the-art performance by emphasizing contact between hands and objects from two aspects. First, to explicitly model hand-object interactive relationships, we introduce a Hand-guided Object Feature Enhancer (HOFE), which utilizes hand features as prior knowledge to extract more contact-relevant and distinguishing object features. Second, to encourage the network to concentrate on hand-object interactions, we design a Contact-Centric Object Decoupling Strategy (CODS), which reduces interference during training by disentangling the overlapping attributes of the segmentation targets, allowing the model to capture the specific contact-aware features associated with each hand. Experiments on various in-domain and out-of-domain test sets show that CaRe-Ego significantly outperforms existing methods while exhibiting robust generalization capability. Code is publicly available at https://github.com/yuggiehk/CaRe-Ego/. | en_US
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | Expert systems with applications, 15 Jan. 2026, v. 296, pt. C, 129148 | en_US
dcterms.isPartOf | Expert systems with applications | en_US
dcterms.issued | 2026-01-15 | -
dc.identifier.scopus | 2-s2.0-105012041169 | -
dc.identifier.eissn | 1873-6793 | en_US
dc.identifier.artn | 129148 | en_US
dc.description.validate | 202510 bcch | en_US
dc.description.oa | Not applicable | en_US
dc.identifier.SubFormID | G000294/2025-08 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | The research work was conducted in the JC STEM Lab of Machine Learning and Computer Vision, funded by The Hong Kong Jockey Club Charities Trust. | en_US
dc.description.pubStatus | Published | en_US
dc.date.embargo | 2028-01-15 | en_US
dc.description.oaCategory | Green (AAM) | en_US
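
The abstract above describes the Hand-guided Object Feature Enhancer (HOFE) as an attention-based module in which hand features act as prior knowledge for extracting contact-relevant object features. The record itself carries no implementation details, so the following is only a minimal sketch of one plausible reading of that idea, using cross-attention in which object features attend to hand features; the class name HOFESketch, the feature dimension, and the head count are illustrative assumptions, not the authors' released code (see the GitHub link in the abstract for that).

# Hedged sketch of the HOFE idea from the abstract, NOT the authors' code.
# Object features attend to hand features, so hand cues act as a prior that
# highlights contact-relevant object tokens. All names and sizes are assumed.
import torch
import torch.nn as nn

class HOFESketch(nn.Module):
    def __init__(self, d_model: int = 256, n_heads: int = 8):
        super().__init__()
        # Cross-attention: object tokens (queries) attend to hand tokens (keys/values).
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, hand_feats: torch.Tensor, obj_feats: torch.Tensor) -> torch.Tensor:
        # hand_feats: (B, N_hand, d_model); obj_feats: (B, N_obj, d_model)
        enhanced, _ = self.cross_attn(query=obj_feats, key=hand_feats, value=hand_feats)
        # Residual connection keeps the original object evidence intact.
        return self.norm(obj_feats + enhanced)

# Toy usage: batch of 2, 64 hand tokens, 196 object tokens, 256-d features.
hofe = HOFESketch()
out = hofe(torch.randn(2, 64, 256), torch.randn(2, 196, 256))
print(out.shape)  # torch.Size([2, 196, 256])

The residual connection here reflects the abstract's framing of hand features as a prior that enhances, rather than replaces, the object representation; the attention term injects hand-conditioned contact cues on top of the original object features.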
Appears in Collections: Journal/Magazine Article
Open Access Information
Status: embargoed access
Embargo End Date: 2028-01-15
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.