Title: Your mouse reveals your next activity: towards predicting user intention from mouse interaction
Authors: Fu, EY 
Kwok, TCK 
Wu, EY 
Leong, HV 
Ngai, G 
Chan, SCF 
Keywords: User intention
Mouse interaction
Multimodal model
Issue Date: 2017
Publisher: Institute of Electrical and Electronics Engineers
Source: 41st IEEE Annual Computer Software and Applications Conference (COMPSAC 2017), Torino, Italy, 4-8 July 2017, v. 1, 8029710, p. 869-874
Abstract: This paper presents an investigation into user intention prediction in two common web-based tasks, crowdsourcing annotation and web search, based on human-mouse interaction information. User experience is gaining importance within the research area of human-centered computing, and is particularly important for complex, multi-step tasks. To enhance user experience, the computer should be intelligent enough to predict the user's intention. For instance, an intelligent agent might anticipate when the user is about to press a button and helpfully enlarge or highlight it in advance. In this paper, we propose two models for predicting user intention: a classical model that considers only the historical mouse activity sequence, and a multimodal model that utilizes mouse interaction signals as well as features extracted from mouse trajectory and clicking events. We evaluate our models and find that they achieve reasonable accuracy. Our preliminary results indicate that we can dynamically learn a multimodal model that effectively predicts a user's next activity from the historical activity sequence and mouse interaction signals.
ISBN: 9781538603673
ISSN: 0730-3157
DOI: 10.1109/COMPSAC.2017.270
Appears in Collections: Conference Paper

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.