Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114220
Title: Gesture recognition for engaging spatial experiences in healthcare: co-design of intelligent interactive illuminative textiles
Authors: Lee, C 
Tan, J 
Tang, HT 
Tan, JJ
Yip, WK 
Tse, KW 
Issue Date: 2025
Source: International journal of AI for materials and design, 2025, v. 2, no. 3, p. 45-63
Abstract: The integration of artificial intelligence (AI) into textile design enhances functionality, automation, and user interaction. While gesture recognition has been explored in smart textiles, contactless interactive systems for healthcare remain underdeveloped. This study presents a human-centered co-design approach to the development of an AI-integrated gesture recognition system embedded in illuminative textile wall panels, aimed at enhancing spatial engagement in healthcare environments. The research was conducted in three key stages. First, a co-design workshop was held to explore user preferences in textile materials, graphic design, and gesture interaction. Second, intelligent illuminative textiles were developed by knitting polymeric optical fiber into base wool yarns to enable illumination. A camera was embedded and integrated with a computer vision-based deep learning model for detecting landmarks on the hands, shoulders, and head. The recognized gestures and body movements triggered specific pre-programmed color changes on the textile surface through edge-integrated light-emitting diodes. Finally, a prototype was fabricated and installed in a government-established District Health Centre in Hong Kong to support physical activity and rehabilitation for elderly users. Semi-structured interviews with stakeholders (including co-designers, users, and occupational therapists) were conducted to evaluate usability and inform design refinements. Stakeholders reported high levels of satisfaction, emphasizing the system's capacity to foster community connection and therapeutic engagement, as well as its intuitive usability and compelling visual feedback. These findings suggest that AI-driven interactive textiles present promising opportunities for rehabilitation, therapeutic environments, and the promotion of elderly well-being.
Keywords: Deep learning
Gesture recognition
Healthcare
Human-artificial intelligence interaction
Illuminative textiles
Interactive textiles
Publisher: AccScience Publishing
Journal: International journal of AI for materials and design 
ISSN: 3041-0746
EISSN: 3029-2573
DOI: 10.36922/IJAMD025170013
Rights: Copyright: © 2025 Author(s). This is an Open-Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), permitting distribution and reproduction in any medium, provided the original work is properly cited.
The following publication Ching Lee, Jeanne Tan, Hiu Ting Tang, Jun Jong Tan, Wing Ki Yip, Ka Wing Tse. Gesture recognition for engaging spatial experiences in healthcare: Co-design of intelligent interactive illuminative textiles. International Journal of AI for Materials and Design 2025, 2(3), 45-63 is available at https://doi.org/10.36922/IJAMD025170013.
Appears in Collections:Journal/Magazine Article

Files in This Item:
File: manuscript_ijamd05193.pdf
Size: 3.83 MB
Format: Adobe PDF
Open Access Information
Status open access
File Version Version of Record

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.