Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/117729
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Biomedical Engineering | en_US |
| dc.contributor | Research Institute for Sports Science and Technology | en_US |
| dc.contributor | Mainland Development Office | en_US |
| dc.creator | Zhao, X | en_US |
| dc.creator | Jin, Y | en_US |
| dc.creator | Wang, AY | en_US |
| dc.creator | Zhang, M | en_US |
| dc.date.accessioned | 2026-03-04T05:59:38Z | - |
| dc.date.available | 2026-03-04T05:59:38Z | - |
| dc.identifier.issn | 0306-4573 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/117729 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Pergamon Press | en_US |
| dc.subject | Journaling system | en_US |
| dc.subject | Large language models | en_US |
| dc.subject | Personal informatics | en_US |
| dc.subject | Physical activity | en_US |
| dc.subject | Post-exercise reflection | en_US |
| dc.title | From tracking to thinking : facilitating post-exercise reflection by a large language model-mediated journaling system | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 63 | en_US |
| dc.identifier.issue | 4 | en_US |
| dc.identifier.doi | 10.1016/j.ipm.2025.104574 | en_US |
| dcterms.abstract | Wearable devices provide rich quantitative data for self-reflection on physical activity. However, users often struggle to derive meaningful insights from these data, highlighting the need for enhanced support. To investigate whether Large Language Models (LLMs) can facilitate this process, we propose and evaluate a human-LLM collaborative reflective journaling paradigm. We developed PaceMind, an LLM-mediated journaling system that implements this paradigm based on a three-stage reflection framework. It can generate data-driven drafts and personalized questions to guide users in integrating exercise data with personal insights. A two-week within-subjects study ((Formula presented)) compared the LLM-mediated system with a template-based journaling baseline. The LLM-mediated design significantly improved the perceived effectiveness of reflection support and increased users’ intention to use the system. However, perceived ease of use did not improve significantly. Users appreciated the LLM’s scaffolding for easing data sense-making, but also reported added cognitive work in verifying and personalizing the LLM-generated content. Although objective activity levels did not change significantly, the LLM-mediated condition showed a trend toward more adaptive exercise planning and sustained engagement. Our findings provide empirical evidence for a human-LLM collaborative reflection paradigm in a data-intensive exercise context. They highlight both the potential of LLMs to deepen user reflection and the critical design challenge of balancing automation with meaningful cognitive engagement and user control. | en_US |
| dcterms.accessRights | embargoed access | en_US |
| dcterms.bibliographicCitation | Information processing and management, June 2026, v. 63, no. 4, 104574 | en_US |
| dcterms.isPartOf | Information processing and management | en_US |
| dcterms.issued | 2026-06 | - |
| dc.identifier.scopus | 2-s2.0-105027541778 | - |
| dc.identifier.eissn | 1873-5371 | en_US |
| dc.identifier.artn | 104574 | en_US |
| dc.description.validate | 202603 bchy | en_US |
| dc.description.oa | Not applicable | en_US |
| dc.identifier.SubFormID | G001060/2026-02 | - |
| dc.description.fundingSource | RGC | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | This study was sponsored by the Research Grants Council (RGC #15211322), Shenzhen Research Fund (JCYJ20230807140414029), and the Research Institute for Sports Science and Technology (RISports) in the Hong Kong Polytechnic University. | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.date.embargo | 2028-06-30 | en_US |
| dc.description.oaCategory | Green (AAM) | en_US |
Appears in Collections: Journal/Magazine Article
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.