Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/99930
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Rehabilitation Sciences | - |
| dc.creator | Ye, J | en_US |
| dc.creator | Jiang, H | en_US |
| dc.creator | Zhong, J | en_US |
| dc.date.accessioned | 2023-07-26T05:49:07Z | - |
| dc.date.available | 2023-07-26T05:49:07Z | - |
| dc.identifier.uri | http://hdl.handle.net/10397/99930 | - |
| dc.language.iso | en | en_US |
| dc.publisher | MDPI | en_US |
| dc.rights | © 2023 by the authors. Licensee MDPI, Basel, Switzerland. | en_US |
| dc.rights | This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US |
| dc.rights | The following publication Ye J, Jiang H, Zhong J. A Graph-Attention-Based Method for Single-Resident Daily Activity Recognition in Smart Homes. Sensors. 2023; 23(3):1626 is available at https://doi.org/10.3390/s23031626. | en_US |
| dc.subject | Human activity recognition | en_US |
| dc.subject | Smart home | en_US |
| dc.subject | Embedding | en_US |
| dc.subject | Graph attention network | en_US |
| dc.subject | Deep learning | en_US |
| dc.title | A graph-attention-based method for single-resident daily activity recognition in smart homes | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 23 | en_US |
| dc.identifier.issue | 3 | en_US |
| dc.identifier.doi | 10.3390/s23031626 | en_US |
| dcterms.abstract | In ambient-assisted living facilitated by smart home systems, the recognition of daily human activities is of great importance. It aims to infer the household’s daily activities from triggered sensor observation sequences with varying time intervals between successive readouts. This paper introduces a novel deep learning framework based on embedding technology and graph attention networks, namely the time-oriented and location-oriented graph attention (TLGAT) networks. The embedding technology converts sensor observations into corresponding feature vectors. TLGAT then represents a sensor observation sequence as a fully connected graph in order to model both the temporal correlation and the sensor-location correlation among sensor observations, and it refines the feature representation of each sensor observation by receiving and weighting the other observations. Experiments were conducted on two public datasets under diverse setups of sensor event sequence length, and the results show that the proposed method achieved favorable performance across these setups. | - |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | Sensors, Feb. 2023, v. 23, no. 3, 1626 | en_US |
| dcterms.isPartOf | Sensors | en_US |
| dcterms.issued | 2023-02 | - |
| dc.identifier.scopus | 2-s2.0-85147893526 | - |
| dc.identifier.pmid | 36772666 | - |
| dc.identifier.eissn | 1424-8220 | en_US |
| dc.identifier.artn | 1626 | en_US |
| dc.description.validate | 202307 bcch | - |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | OA_Scopus/WOS | - |
| dc.description.fundingSource | RGC | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | German Academic Exchange Service of Germany; National Natural Science Foundation of China | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.oaCategory | CC | en_US |
Appears in Collections: Journal/Magazine Article
Files in This Item:
| File | Description | Size | Format |  |
|---|---|---|---|---|
| Ye_Graph-Attention-Based_Method_Single-Resident.pdf |  | 1.44 MB | Adobe PDF | View/Open |
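The abstract above outlines the TLGAT pipeline: discrete sensor observations are embedded as feature vectors, the event sequence is treated as a fully connected graph, and graph attention re-weights each event's representation against every other event. Below is a minimal, illustrative sketch of that generic embed-then-attend step in PyTorch; it is not the authors' TLGAT implementation, and all class, parameter, and dimension choices (e.g. `SensorEventGAT`, `embed_dim=32`) are assumptions made for demonstration.

```python
# A minimal sketch (NOT the authors' TLGAT code) of the idea in the
# abstract: embed each sensor event, treat the sequence as a fully
# connected graph, and let a graph-attention layer re-weight every
# event's features against all others. Names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SensorEventGAT(nn.Module):
    def __init__(self, num_sensor_states: int, embed_dim: int = 32, out_dim: int = 32):
        super().__init__()
        # Embedding step: map discrete sensor observations to vectors.
        self.embed = nn.Embedding(num_sensor_states, embed_dim)
        self.W = nn.Linear(embed_dim, out_dim, bias=False)
        # Additive attention over concatenated node pairs (GAT-style).
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, event_ids: torch.Tensor) -> torch.Tensor:
        # event_ids: (T,) integer codes for one triggered-sensor sequence.
        h = self.W(self.embed(event_ids))             # (T, out_dim)
        T = h.size(0)
        # Fully connected graph: score every (i, j) pair of events.
        hi = h.unsqueeze(1).expand(T, T, -1)          # (T, T, out_dim)
        hj = h.unsqueeze(0).expand(T, T, -1)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        alpha = torch.softmax(e, dim=-1)              # attention weights
        # Each event's representation receives all other events, weighted.
        return alpha @ h                              # (T, out_dim)

# Toy usage: 10 events drawn from a vocabulary of 50 sensor states.
model = SensorEventGAT(num_sensor_states=50)
seq = torch.randint(0, 50, (10,))
print(model(seq).shape)  # torch.Size([10, 32])
```

Per the abstract, the actual model runs two such attention branches, one oriented to event timing and one to sensor location; the sketch shows only the shared attention pattern, and the paper itself should be consulted for those details.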
| Metric | Count | As of |
|---|---|---|
| Page views | 149 | Nov 9, 2025 |
| Page views, last week | 13 | Nov 9, 2025 |
| Page views, last month | 13 | Nov 9, 2025 |
| Downloads | 63 | Nov 9, 2025 |
| SCOPUS™ citations | 8 | Dec 19, 2025 |
| Web of Science™ citations | 7 | Dec 18, 2025 |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.