Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105710
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Li, J | en_US
dc.creator | Ngai, G | en_US
dc.creator | Leong, HV | en_US
dc.creator | Chan, SCF | en_US
dc.date.accessioned | 2024-04-15T07:36:03Z | -
dc.date.available | 2024-04-15T07:36:03Z | -
dc.identifier.isbn | 978-1-4673-8845-0 (Electronic) | en_US
dc.identifier.isbn | 978-1-4673-8846-7 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/105710 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | ©2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication J. Li, G. Ngai, H. V. Leong and S. C. F. Chan, "Your Eye Tells How Well You Comprehend," 2016 IEEE 40th Annual Computer Software and Applications Conference (COMPSAC), Atlanta, GA, USA, 2016, pp. 503-508 is available at https://doi.org/10.1109/COMPSAC.2016.220. | en_US
dc.subject | Comprehension detection | en_US
dc.subject | Eye gaze | en_US
dc.subject | Reading | en_US
dc.title | Your eye tells how well you comprehend | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 503 | en_US
dc.identifier.epage | 508 | en_US
dc.identifier.volume | 2 | en_US
dc.identifier.doi | 10.1109/COMPSAC.2016.220 | en_US
dcterms.abstract | Systems that automatically adapt to changing human needs are useful, and they build upon advances in human-computer interaction research. In this paper, we investigate how well a user's eye movements while reading an article can predict the user's level of reading comprehension, which could be exploited in intelligent adaptive e-learning systems. We characterize the eye movement pattern as an eye gaze signal. We invited human subjects to read articles of different difficulty levels, inducing different levels of comprehension. Machine-learning techniques are applied to identify features that are useful for recognizing when readers are having difficulty understanding their reading material. Finally, we build a detection model that can identify different levels of user comprehension. We achieve a performance improvement of over 30% above the baseline, translating to a reduction of over 50% in detection error. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | 2016 IEEE 40th Annual Computer Software and Applications Conference (COMPSAC), 10-14 June 2016, Atlanta, Georgia, v. 2, p. 503-508 | en_US
dcterms.issued | 2016 | -
dc.identifier.scopus | 2-s2.0-84987994831 | -
dc.relation.conference | IEEE Annual International Computer Software and Applications Conference [COMPSAC] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-1465 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 9580616 | -
dc.description.oaCategory | Green (AAM) | en_US
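
The abstract above describes a pipeline of extracting features from an eye gaze signal and training a machine-learning model to detect the reader's comprehension level. The following Python sketch illustrates one plausible shape of such a pipeline; it is not the authors' actual method. The feature set (fixation count, fixation durations, regression rate), the synthetic session generator, and the random-forest classifier are all illustrative assumptions, standing in for the features and model the paper selects empirically.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def extract_gaze_features(fixations):
    # `fixations` is assumed to be a reading-ordered list of
    # (x, y, duration_ms) tuples from an eye tracker.
    xs = np.array([f[0] for f in fixations], dtype=float)
    durations = np.array([f[2] for f in fixations], dtype=float)
    # Leftward jumps between consecutive fixations approximate regressions,
    # a common proxy for re-reading and reading difficulty.
    regression_rate = np.mean(np.diff(xs) < 0)
    return np.array([len(fixations),
                     durations.mean(),
                     durations.std(),
                     regression_rate])

def synthetic_session(level, n_fixations=80):
    # Toy data generator standing in for real recordings: lower
    # comprehension (level 0) yields longer fixations and more regressions.
    steps = rng.normal(loc=8.0, scale=6.0 + 4.0 * (2 - level), size=n_fixations)
    xs = np.cumsum(steps)
    ys = rng.normal(300.0, 20.0, size=n_fixations)
    durations = rng.normal(250.0 + 60.0 * (2 - level), 30.0, size=n_fixations)
    return list(zip(xs, ys, durations))

# Labelled dataset: comprehension level 0 (low) to 2 (high), 50 sessions each.
X = np.array([extract_gaze_features(synthetic_session(level))
              for level in (0, 1, 2) for _ in range(50)])
y = np.array([level for level in (0, 1, 2) for _ in range(50)])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

In the actual study, useful features were identified by machine-learning techniques over the recorded gaze signal; real fixation data from an eye tracker would replace the synthetic sessions above.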
Appears in Collections: Conference Paper

Files in This Item:
File | Description | Size | Format
Li_Your_Eye_Tells.pdf | Pre-Published version | 1.35 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.