Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/115752
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Electrical and Electronic Engineering | en_US |
| dc.creator | Su, Y | en_US |
| dc.creator | Wang, Y | en_US |
| dc.creator | Chau, LP | en_US |
| dc.date.accessioned | 2025-10-27T08:44:10Z | - |
| dc.date.available | 2025-10-27T08:44:10Z | - |
| dc.identifier.issn | 0957-4174 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/115752 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Elsevier Ltd | en_US |
| dc.subject | Attention mechanism | en_US |
| dc.subject | Egocentric | en_US |
| dc.subject | Hand-object interaction | en_US |
| dc.subject | Relationship modeling | en_US |
| dc.title | CaRe-Ego: contact-aware relationship modeling for egocentric interactive hand-object segmentation | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 296 | en_US |
| dc.identifier.doi | 10.1016/j.eswa.2025.129148 | en_US |
| dcterms.abstract | Egocentric interactive hand-object segmentation (EgoIHOS) requires segmenting hands and interacting objects in egocentric images, which is crucial for understanding human behaviors in assistive systems. Current methods often overlook the essential interactive relationships between hands and objects, or merely establish coarse hand-object associations to recognize targets, leading to suboptimal accuracy. To address this issue, we propose a novel CaRe-Ego method that achieves state-of-the-art performance by emphasizing contact between hands and objects from two aspects. First, to explicitly model hand-object interactive relationships, we introduce a Hand-guided Object Feature Enhancer (HOFE), which utilizes hand features as prior knowledge to extract more contact-relevant and distinguishing object features. Second, to encourage the network to concentrate on hand-object interactions, we design a Contact-Centric Object Decoupling Strategy (CODS) to reduce interference during training by disentangling the overlapping attributes of the segmentation targets, allowing the model to capture specific contact-aware features associated with each hand. Experiments on various in-domain and out-of-domain test sets show that CaRe-Ego significantly outperforms existing methods while exhibiting robust generalization capability. Codes are publicly available at https://github.com/yuggiehk/CaRe-Ego/. | en_US |
| dcterms.accessRights | embargoed access | en_US |
| dcterms.bibliographicCitation | Expert systems with applications, 15 Jan. 2026, v. 296, pt. C, 129148 | en_US |
| dcterms.isPartOf | Expert systems with applications | en_US |
| dcterms.issued | 2026-01-15 | - |
| dc.identifier.scopus | 2-s2.0-105012041169 | - |
| dc.identifier.eissn | 1873-6793 | en_US |
| dc.identifier.artn | 129148 | en_US |
| dc.description.validate | 202510 bcch | en_US |
| dc.description.oa | Not applicable | en_US |
| dc.identifier.SubFormID | G000294/2025-08 | - |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | The research work was conducted in the JC STEM Lab of Machine Learning and Computer Vision funded by The Hong Kong Jockey Club Charities Trust. | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.date.embargo | 2028-01-15 | en_US |
| dc.description.oaCategory | Green (AAM) | en_US |
| Appears in Collections: | Journal/Magazine Article | |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
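The record does not include the paper's implementation, so the following is only a rough, hypothetical sketch of what "hand features as prior knowledge to extract more contact-relevant object features" (the HOFE idea in the abstract) could look like as cross-attention: object tokens attend to hand tokens and are residually enhanced. All names, shapes, and the attention formulation are assumptions for illustration, not the authors' method.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hand_guided_enhance(obj_feats, hand_feats):
    """Hypothetical sketch of hand-guided enhancement:
    object tokens (queries) attend over hand tokens (keys/values),
    and the attended hand context is added back residually."""
    d = obj_feats.shape[-1]
    attn = softmax(obj_feats @ hand_feats.T / np.sqrt(d))  # (N_obj, N_hand)
    return obj_feats + attn @ hand_feats                   # residual enhancement

rng = np.random.default_rng(0)
obj = rng.standard_normal((16, 64))    # hypothetical object tokens
hand = rng.standard_normal((8, 64))    # hypothetical hand tokens
out = hand_guided_enhance(obj, hand)
print(out.shape)  # (16, 64)
```

This only illustrates the general "one modality guides the other's features" pattern; the actual HOFE design is described in the paper linked above.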