Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115349
Title: Integrating emotional intelligence in design
Authors: Wang, SJ 
Issue Date: Sep-2025
Abstract: Professor Stephen Jia Wang’s research explores the potential of developing multimodal emotion recognition technologies and integrating them into people’s daily lives to enhance emotional intelligence and well-being. The research leverages advanced design and AI techniques to address growing global concerns about distress, sadness and anxiety, as documented by Daly & Macchia (2010). It examines how interactive and intelligent systems design might effectively reduce stress by recognising and responding to subtle emotional expressions, such as micro-gestures (MGs). It seeks to understand users’ emotional support needs to establish the foundational parameters for a framework that guides the design of emotionally intelligent systems, with a specific focus on prioritising and stimulating emotional interactions.
A comprehensive literature review, covering work published between 2014 and 2023, identifies a significant shift from traditional subjective emotion assessment methods to multimodal emotion recognition technologies. It highlights that reliance on a single modality limits robustness, especially in complex, real-life scenarios, and emphasises three key advantages of multimodal methods:
1) Enhanced Accuracy: Simultaneous access to multiple emotional modalities (e.g. smiling and applauding when happy) improves contextual understanding and recognition accuracy (Baltrušaitis et al., 2018).
2) Complementary Strengths: Different modalities complement one another, each offsetting the limitations of the others (Xue et al., 2024).
3) Missing Data Compensation: Missing input from one modality can be compensated for by data from another (D'mello & Kory, 2015), such as recognising emotion from visual cues when audio is absent; a minimal fusion sketch follows this list.
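To make the third point concrete, the following is a minimal sketch of decision-level ("late") fusion that averages per-emotion scores over whichever modalities are currently available, so a missing stream (e.g. audio) degrades gracefully instead of breaking the pipeline. The emotion set, modality names and scores are illustrative assumptions, not values from the reviewed systems; in practice each modality's scores would come from a trained classifier.

```python
# Hypothetical late-fusion sketch: average per-emotion probabilities
# over the modalities that are present (missing-data compensation).

EMOTIONS = ["happy", "neutral", "stressed"]  # assumed label set

def fuse_predictions(modality_scores):
    """Fuse per-modality probability lists (aligned with EMOTIONS),
    skipping any modality whose input is missing (None)."""
    available = [s for s in modality_scores.values() if s is not None]
    if not available:
        raise ValueError("at least one modality is required")
    fused = [sum(scores[i] for scores in available) / len(available)
             for i in range(len(EMOTIONS))]
    return dict(zip(EMOTIONS, fused))

# Audio is absent here; visual and micro-gesture cues compensate.
scores = {
    "visual":        [0.70, 0.20, 0.10],  # e.g. a detected smile
    "audio":         None,                # microphone unavailable
    "micro_gesture": [0.60, 0.25, 0.15],  # e.g. applause-like motion
}
print(fuse_predictions(scores))
# {'happy': 0.65, 'neutral': 0.225, 'stressed': 0.125}
```

Simple averaging is only a baseline fusion rule; weighted or learned fusion strategies, of the kind surveyed by Baltrušaitis et al. (2018), would slot into the same structure.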
The Major Collaborative Output (MCO) encompasses five interrelated research projects funded by the Transport Department of Hong Kong’s Smart Traffic Fund and industry partners Huawei, GAC and the Research Centre for Future (Caring) Mobility (RcFCM), with total funding of HK$5.7M. These projects collectively address critical research questions (RQs) and provide foundational knowledge for a family of patents. Notable outcomes include the award-winning ‘EmoSense’ technology and the ‘EmoFriends’ toolkit, which transforms common plush toys into intelligent companion robots that provide real-time emotional support and foster healthier emotional environments.
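Purely as an illustration of the interaction pattern such a companion toy implies (not the EmoSense or EmoFriends implementation, whose internals are not described here), a sense-classify-respond loop might look like the following sketch; every sensor, gesture label and response below is a hypothetical stand-in.

```python
# Hypothetical sense -> classify -> respond loop for an emotionally
# responsive plush companion. Not the actual EmoSense/EmoFriends code.
import random
import time

def read_micro_gesture():
    """Stand-in sensor read; a real toy would classify touch,
    accelerometer or vision input with a trained model."""
    return random.choice(["gentle_stroke", "tight_squeeze", "idle"])

RESPONSES = {
    "gentle_stroke": "purr softly",          # reinforce calm interaction
    "tight_squeeze": "play a soothing tone", # respond to possible distress
    "idle":          "stay quiet",
}

def run_companion(cycles=3, interval_s=1.0):
    for _ in range(cycles):
        gesture = read_micro_gesture()
        print(f"sensed {gesture!r} -> {RESPONSES[gesture]}")
        time.sleep(interval_s)

if __name__ == "__main__":
    run_companion()
```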
Rights: All rights reserved.
Posted with permission of the author.
Appears in Collections:Creative Work

Files in This Item:
Wang_Integrating_Emotional_Intelligence.pdf (5.96 MB, Adobe PDF)
Open Access Status: open access

