Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/64378
Title: Building a personalized, auto-calibrating eye tracker from user interactions
Authors: Huang, MX 
Kwok, TCK 
Ngai, G 
Chan, SCF 
Leong, HV 
Issue Date: 2016
Source: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI '16, San Jose, California, USA, May 07 - 12, 2016, p. 5169-5179
Abstract: We present PACE, a Personalized, Auto-Calibrating Eye-tracking system that identifies and collects data unobtrusively from user interaction events on standard computing systems, without the need for specialized equipment. PACE relies on eye/facial analysis of webcam data based on a set of robust geometric gaze features and a two-layer data validation mechanism to identify good training samples from daily interaction data. The design of the system is founded on an in-depth investigation of the relationship between gaze patterns and interaction cues, and takes into consideration user preferences and habits. The result is an adaptive, data-driven approach that continuously recalibrates, adapts and improves with additional use. Quantitative evaluation on 31 subjects across different interaction behaviors shows that training instances identified by the PACE data collection have higher gaze point-interaction cue consistency than those identified by conventional approaches. An in-situ study using real-life tasks on a diverse set of interactive applications demonstrates that the PACE gaze estimation achieves an average error of 2.56°, which is comparable to the state of the art, but without the need for explicit training or calibration. This demonstrates the effectiveness of both the gaze estimation method and the corresponding data collection mechanism.
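The abstract describes an implicit-calibration loop: interaction events such as mouse clicks supply candidate gaze-target training pairs, a validation mechanism filters out inconsistent samples, and the gaze model is continuously refit as more data accumulates. The sketch below is not the authors' implementation; it is a minimal Python illustration of that general loop, with hypothetical names (ImplicitCalibrator, add_click_sample), an off-the-shelf ridge regressor, and a single consistency check standing in for the paper's robust geometric gaze features and two-layer validation.

```python
import numpy as np
from sklearn.linear_model import Ridge

class ImplicitCalibrator:
    """Accumulates (gaze-feature, click-position) pairs and refits a simple
    regressor as interaction data arrives. Hypothetical sketch only."""

    def __init__(self, consistency_px=150.0, min_samples=10):
        self.consistency_px = consistency_px   # reject clicks that disagree with the current model
        self.min_samples = min_samples
        self.features, self.targets = [], []
        self.model_x, self.model_y = Ridge(), Ridge()
        self.fitted = False

    def add_click_sample(self, gaze_features, click_xy):
        """Call on every mouse click with the webcam-derived feature vector."""
        gaze_features = np.asarray(gaze_features, dtype=float)
        click_xy = np.asarray(click_xy, dtype=float)
        if self.fitted:
            # Consistency check: drop clicks the user probably did not look at.
            if np.linalg.norm(self.predict(gaze_features) - click_xy) > self.consistency_px:
                return
        self.features.append(gaze_features)
        self.targets.append(click_xy)
        if len(self.features) >= self.min_samples:
            # Refit one regressor per screen axis from all accepted samples.
            X, Y = np.vstack(self.features), np.vstack(self.targets)
            self.model_x.fit(X, Y[:, 0])
            self.model_y.fit(X, Y[:, 1])
            self.fitted = True

    def predict(self, gaze_features):
        """Estimate the on-screen gaze position (x, y) in pixels."""
        x = np.atleast_2d(gaze_features)
        return np.array([self.model_x.predict(x)[0], self.model_y.predict(x)[0]])
```

The regressor, feature representation, and pixel threshold here are placeholders; the paper's contribution lies precisely in how the gaze features are made robust and how unreliable interaction samples are validated before being used for training.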
Keywords: Gaze estimation
Implicit modeling
Data validation
Gaze-interaction correspondence
Publisher: ACM Press
ISBN: 9781450333627 (print)
DOI: 10.1145/2858036.2858404
Rights: ©2016 ACM. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), pages 5169-5179, http://dx.doi.org/10.1145/2858036.2858404
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
The following publication Huang, M. X., Kwok, T. C. K., Ngai, G., Chan, S. C. F., & Leong, H. V. (2016, May). Building a personalized, auto-calibrating eye tracker from user interactions. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 5169-5179). New York: ACM is available at https://doi.org/10.1145/2858036.2858404
Appears in Collections: Conference Paper

Files in This Item:
File: Huang_Building_Personalized_Auto-Calibrating.pdf
Description: Pre-Published version
Size: 1.82 MB
Format: Adobe PDF
Open Access Information
Status: Open access
File Version: Final Accepted Manuscript

Page views: 109 (last week: 1; as of Apr 14, 2024)
Downloads: 157 (as of Apr 14, 2024)
Scopus citations: 49 (as of Apr 19, 2024)
Web of Science citations: 38 (last week: 0; as of Apr 18, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.