Title: A lattice-theoretic approach to runtime property detection for pervasive context
Authors: Hua, T; Huang, Y; Cao, J; Tao, X
Issue Date: 2010
Source: Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics), 2010, v. 6406 LNCS, p. 307-321
Abstract: Runtime detection of contextual properties is one of the primary approaches to enabling context-awareness. Existing property detection schemes implicitly assume that the contexts under detection belong to the same snapshot of time. However, this assumption does not necessarily hold in asynchronous pervasive computing environments. To cope with the asynchrony, we first model environment behavior based on logical time. A key notion of our model is that all meaningful observations of the environment form a lattice structure. We then propose the LAT algorithm, which maintains the lattice of meaningful observations at runtime, and the LATPD algorithm, which detects contextual properties at runtime. We implement the algorithms over the open-source context-aware middleware MIPA and conduct simulations. The evaluation results show that LAT and LATPD support effective detection of contextual properties in asynchronous environments.
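The lattice structure mentioned in the abstract can be illustrated with a small sketch. This is not the paper's LAT/LATPD algorithm; it is a minimal example, assuming vector clocks as the logical-time model and hypothetical cuts over two processes, showing that observations ("consistent cuts") ordered component-wise form a lattice whose join and meet are the component-wise max and min.

```python
# Illustrative sketch only (not the LAT/LATPD algorithms from the paper):
# consistent cuts of an asynchronous two-process system, represented as
# vector clocks, ordered component-wise. Joins and meets stay within the
# set of cuts, i.e. the observations form a lattice.
from itertools import product

def leq(a, b):
    """Component-wise vector-clock order: cut a precedes or equals cut b."""
    return all(x <= y for x, y in zip(a, b))

def join(a, b):
    """Least upper bound of two cuts: component-wise maximum."""
    return tuple(max(x, y) for x, y in zip(a, b))

def meet(a, b):
    """Greatest lower bound of two cuts: component-wise minimum."""
    return tuple(min(x, y) for x, y in zip(a, b))

# Hypothetical set of consistent cuts: (events observed on p0, on p1).
cuts = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1)]

# Lattice closure: the join and meet of any two cuts are again cuts.
assert all(join(a, b) in cuts and meet(a, b) in cuts
           for a, b in product(cuts, cuts))
```

Maintaining exactly this kind of closed set of observations incrementally, as context updates arrive, is what the abstract attributes to the LAT algorithm; property detection then amounts to evaluating predicates over elements of the lattice.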
Keywords: Asynchronous environment; Property detection
Publisher: Springer
Journal: Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics) 
ISBN: 3642163548
ISSN: 0302-9743
EISSN: 1611-3349
DOI: 10.1007/978-3-642-16355-5_26
Description: 7th International Conference on Ubiquitous Intelligence and Computing, UIC 2010, Xi'an, 26-29 October 2010
Appears in Collections:Conference Paper



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.