Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/118356
DC Field | Value | Language
dc.contributor | Department of Biomedical Engineering | en_US
dc.contributor | Research Institute for Sports Science and Technology | en_US
dc.contributor | Research Institute for Smart Ageing | en_US
dc.creator | Zha, L | en_US
dc.creator | Chen, M | en_US
dc.creator | Tam, AYC | en_US
dc.creator | Wong, DWC | en_US
dc.creator | Cheung, JCW | en_US
dc.date.accessioned | 2026-04-09T02:57:09Z | -
dc.date.available | 2026-04-09T02:57:09Z | -
dc.identifier.uri | http://hdl.handle.net/10397/118356 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.subject | Multitask learning | en_US
dc.subject | Non-contact sensing | en_US
dc.subject | Sleep monitoring | en_US
dc.subject | Ultra-wideband radar | en_US
dc.title | IoT UWB-radar array system with two-stage multi-task deep network for bed occupancy and posture surveillance via spatial echo feature map and cross-modal visual explainability | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.doi | 10.1109/JIOT.2026.3674529 | en_US
dcterms.abstract | Accurate, unobtrusive in-bed occupancy-posture IoT monitoring is critical for timely detection of “missing patient” events, for reducing false alarms in nocturnal surveillance, and for informing sleep posture-related health risk management. This paper presents a novel two-stage, multi-task, multiple ultra-wideband (UWB) radar IoT system for joint in-bed occupancy detection and fine-grained sleep posture classification, using an array of eight UWB radars. The technical novelty centers on three mechanisms: a View-Weighted Multi-Radar Network (VWM-Net) that adaptively fuses single-radar echo maps into a Spatial Echo Feature Map (SEFM); an integration of DenseNet and ConvNeXt2 (DCNX2); and a cross-modal generative model that translates SEFMs into human-interpretable depth posture images. Specifically, a UCYC-GAN is proposed by integrating a U-Net structure with a CycleGAN generator, and its visual utility is compared against NICE-GAN and BDBM. The system is trained and validated on a cohort of 100 participants across 10 fine-grained postures and 4 blanket conditions, supplemented with a life-sized dummy and cluttered objects to model non-human occupancy and environmental distractors. In the first stage, a dedicated classifier distinguishes humans from non-human entities with 99.09% accuracy. In the second stage, the multi-head DCNX2 backbone performs multi-task classification, achieving 93.75% accuracy for 10 fine-grained postures, 98% for four coarse-grained postures, 88.25% for blanket-coverage types, 98.13% for blanket presence, and 95.13% for participant gender. The overall system effectively mitigates false positives from environmental clutter, advances non-contact in-bed occupancy-posture surveillance, and, through a cloud-enabled web dashboard and APIs, supports real-time visualization and retrospective data analysis. | en_US
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | IEEE internet of things journal, Date of Publication: 16 March 2026, Early Access, https://doi.org/10.1109/JIOT.2026.3674529 | en_US
dcterms.isPartOf | IEEE internet of things journal | en_US
dcterms.issued | 2026 | -
dc.identifier.eissn | 2327-4662 | en_US
dc.description.validate | 202604 bcch | en_US
dc.description.oa | Not applicable | en_US
dc.identifier.FolderNumber | a4362 | -
dc.identifier.SubFormID | 52638 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This work was supported in part by the General Research Fund (GRF) from the University Grants Committee of Hong Kong under Grant PolyU15223822, and in part by the Research Institute for Smart Ageing of The Hong Kong Polytechnic University under Grant P0039001. | en_US
dc.description.pubStatus | Early release | en_US
dc.date.embargo | 0000-00-00 (to be updated) | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections:Journal/Magazine Article
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.