Title: Eye movement data modeling using a genetic algorithm
Authors: Zhang, Y
Fu, H
Liang, Z
Zhao, X
Chi, Z 
Feng, D
Keywords: Computer vision
Genetic algorithms
Target tracking
Issue Date: 2009
Publisher: IEEE
Source: 2009 IEEE Congress on Evolutionary Computation (CEC '09), 18-21 May 2009, Trondheim, Norway, pp. 1038-1044
Abstract: We present a computational model of human eye movements based on a genetic algorithm (GA). The model generates elemental raw eye movement data over a four-second viewing window at a 25 Hz sampling rate. Based on the physiological and psychological characteristics of the human visual system, the fitness function of the GA model is constructed from five factors: a saliency map, short-term memory, a saccade distribution, a Region of Interest (ROI) map, and a retina model. Our model produces the full scan path of a subject viewing an image, not just a handful of fixation points or artificial ROIs as in other models. We have also developed both subjective and objective methods to evaluate the model by comparing its behavior with real eye movement data collected from an eye tracker. Tested on 18 (9 × 2) images drawn from an obvious-object image group and a non-obvious-object image group, the subjective evaluation shows very close scores between the scan paths generated by the GA model and the real scan paths; for the objective evaluation, experimental results show that the distance between the GA's scan paths and human scan paths on the same image exhibits no significant difference, with a probability of 78.9% on average.
ISBN: 978-1-4244-2958-5
978-1-4244-2959-2 (E-ISBN)
DOI: 10.1109/CEC.2009.4983060
Appears in Collections:Conference Paper
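The abstract describes a GA whose fitness function combines five factors (saliency, short-term memory, saccade distribution, ROI map, retina model) to evolve a 100-sample scan path (4 s at 25 Hz). The following is a minimal sketch of that idea, not the paper's implementation: the grid size, the weights, and each factor's stand-in formula (e.g. revisit penalty for memory, jump-size prior for the saccade distribution) are hypothetical illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

W, H = 64, 48   # image grid (hypothetical size)
N = 100         # 4 s viewing window at 25 Hz -> 100 gaze samples

# Hypothetical stand-ins for the paper's saliency and ROI maps.
saliency = rng.random((H, W))
roi = np.zeros((H, W))
roi[10:30, 20:50] = 1.0

def fitness(path):
    """Weighted sum of five factors; weights are illustrative only."""
    xs, ys = path[:, 0], path[:, 1]
    sal = saliency[ys, xs].mean()            # 1. saliency map
    roi_hit = roi[ys, xs].mean()             # 2. ROI map coverage
    steps = np.hypot(np.diff(xs), np.diff(ys))
    memory = -np.mean(steps < 1.0)           # 3. short-term memory: penalize revisits
    sacc = -np.mean((steps - 5.0) ** 2) / 100.0  # 4. saccade-amplitude prior
    retina = -np.mean(steps > 20.0)          # 5. retina model: penalize huge jumps
    return 2 * sal + 2 * roi_hit + memory + sacc + retina

def random_path():
    return np.column_stack([rng.integers(0, W, N), rng.integers(0, H, N)])

def mutate(path, rate=0.05):
    child = path.copy()
    mask = rng.random(N) < rate
    child[mask, 0] = rng.integers(0, W, mask.sum())
    child[mask, 1] = rng.integers(0, H, mask.sum())
    return child

def crossover(a, b):
    cut = rng.integers(1, N)  # single-point crossover
    return np.vstack([a[:cut], b[cut:]])

def evolve(pop_size=40, generations=60):
    pop = [random_path() for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 4]
        pop = elite + [
            mutate(crossover(elite[rng.integers(len(elite))],
                             elite[rng.integers(len(elite))]))
            for _ in range(pop_size - len(elite))
        ]
    return max(pop, key=fitness)

best = evolve()  # evolved scan path: 100 (x, y) gaze samples
```

The elitist loop keeps the top quarter of each generation, so the best fitness is non-decreasing; the evolved path is one candidate scan path over the image grid.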


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.