|Title:||i*Chameleon: an MVC-based middleware framework for the support of multimodal application development|
|Authors:||Lo, Wai Kwan|
|Degree:||M.Phil.|
|Issue Date:||2014|
|Abstract:||Multimodal human computer interactions are becoming increasingly popular, especially in ubiquitous and pervasive computing applications. These applications demand highly responsive and intuitive human control interfaces. Because of their nature and form factor, the traditional keyboard, video and mouse interfaces are often not appropriate or adequate. As a result, there is much current research on developing novel interaction devices and sensors, or algorithms for signal processing. However, there are still challenges when it comes to integrating and customizing different heterogeneous devices into a human-centered multimodal application. In addition, owing to the static binding between user control and the application, and the strong coupling between the application programming interface (API) and heterogeneous devices, the development of multimodal applications remains a difficult task. In this thesis, we introduce i*Chameleon, which not only leverages a principled and comprehensive development cycle that systematically captures the principles behind multimodal interaction, but also provides a configurable and extensible multimodal platform to support the development of highly interactive applications. Through the use of an MVC architectural pattern, it enforces the principle of separation of concerns to facilitate cross-collaboration between device engineers, programmers, modality designers and interaction designers, who work on different aspects of human computer interaction and programming. Collectively, the development efforts are combined, integrated and compiled by the i*Chameleon kernel to create the multimodal interactive application.
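The separation-of-concerns idea described above can be illustrated with a minimal sketch. All class and method names here are illustrative assumptions, not the actual i*Chameleon API: a device engineer supplies the model (raw sensor events), a modality designer supplies the controller (mapping raw events to semantic commands), and an application programmer supplies the view (reacting to commands only, never to devices directly).

```python
# Hypothetical MVC sketch of decoupling devices from applications.
# None of these names come from the thesis; they only illustrate
# how the three roles can work on independent components.

class GestureDevice:
    """Model: emits raw sensor events (simulated here)."""
    def read(self):
        return {"type": "swipe", "dx": 42}

class ModalityController:
    """Controller: translates raw device events into semantic commands."""
    def interpret(self, event):
        if event["type"] == "swipe" and event["dx"] > 0:
            return "NEXT_TRACK"
        return "NO_OP"

class MusicPlayerView:
    """View: the application reacts to commands, not to devices."""
    def __init__(self):
        self.track = 0
    def handle(self, command):
        if command == "NEXT_TRACK":
            self.track += 1

device = GestureDevice()
controller = ModalityController()
view = MusicPlayerView()
view.handle(controller.interpret(device.read()))
print(view.track)  # 1
```

Because the view depends only on semantic commands, a different device (say, a voice recognizer emitting the same "NEXT_TRACK" command) could be swapped in without touching the application code.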
During execution, i*Chameleon also supports dynamic adaptation across components according to contextual information from the surrounding environment. For example, if a user is accessing a video via a regular smart phone on i*Chameleon and a high-resolution display is then discovered, the video can be streamed to that display to take advantage of the higher resolution, without having to modify and re-compile the application, or even having to restart it. This capability moves multimodal applications closer to being human-centered rather than device-centered. In the process, the usability and flexibility of the applications are enhanced.
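The display-discovery example above amounts to re-binding an output target at runtime. The following sketch, using entirely hypothetical names rather than any i*Chameleon component, shows the core idea: a registry of known displays is updated when a new device appears, and the stream target is chosen by querying the registry rather than being hard-coded.

```python
# Hypothetical sketch of runtime re-binding: a newly discovered display
# becomes the stream target without recompiling or restarting the app.

class DisplayRegistry:
    def __init__(self):
        # the phone's own screen is the only display known at start-up
        self.displays = [("phone", 720)]

    def discover(self, name, resolution):
        """Called when a new display is detected in the environment."""
        self.displays.append((name, resolution))

    def best(self):
        """The app always streams to the highest-resolution display."""
        return max(self.displays, key=lambda d: d[1])

registry = DisplayRegistry()
print(registry.best())  # ('phone', 720)

registry.discover("wall-display", 2160)  # device appears at runtime
print(registry.best())  # ('wall-display', 2160) -- stream retargets
```

The application never names a concrete display; it only asks the registry, which is what allows the binding to change while the application keeps running.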
To validate the soundness of i*Chameleon, we implemented the platform using two approaches: web services and a publish/subscribe architecture. We developed two experimental applications: Mobile DJ and an interactive robot exhibit. Mobile DJ was implemented over web services to test the support for multimodal interactions over distributed components in real time, regardless of the users' locations. Through the supported web services, players can browse and search for sound tracks that others are currently working on, giving them a channel to contribute collaboratively. In the second experiment, an interactive robot exhibit was developed using publish/subscribe middleware to demonstrate dynamic adaptation. Modalities and devices can be changed according to the users' behavior (e.g., location) and the contextual environment (e.g., level of loudness). Both experimental applications produced positive results. The experience shows that i*Chameleon helps to decompose the development process into different aspects, each of which can be developed fairly independently. The overall achievement is that the interaction components become more reusable and the system itself becomes more flexible, validating the design of i*Chameleon.
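The publish/subscribe style used in the robot exhibit can be sketched in a few lines. This is a generic illustration under assumed topic names ("user/location", "env/loudness"), not the middleware used in the thesis: modalities publish context events to a broker, and subscribers react, so a device or modality can be swapped by changing subscriptions rather than application code.

```python
# Generic publish/subscribe sketch; topic names are assumptions
# for illustration, not taken from the robot exhibit itself.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

broker = Broker()
reactions = []

# the robot controller reacts to context events from any modality
broker.subscribe("user/location", lambda p: reactions.append(("move", p)))
broker.subscribe("env/loudness", lambda p: reactions.append(("volume", p)))

broker.publish("user/location", "zone-A")
broker.publish("env/loudness", 70)
print(reactions)  # [('move', 'zone-A'), ('volume', 70)]
```

Because publishers and subscribers know only topics, not each other, this decoupling is what lets modalities and devices be changed at runtime, as the exhibit demonstrates.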
|Subjects:||Multimodal user interfaces (Computer systems); Hong Kong Polytechnic University -- Dissertations|
|Pages:||xv, 99 leaves : illustrations ; 30 cm|
|Appears in Collections:||Thesis|
View full-text via https://theses.lib.polyu.edu.hk/handle/200/7736
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.