Device Binding for Adaptive Multimodal Interfaces

Speaker: 
Srihathai Prammanee

Multimodal interaction is one of the main categories in the taxonomy of Human-Computer Interaction (HCI). With the introduction of multimodal interaction, input/output information becomes associated with the different human senses, so that information can be presented in the most efficient and natural way. In mobile communication, however, a number of restrictions remain. Mostly, these restrictions are caused by the limitations of a mobile terminal's user interfaces.

This contribution introduces an architectural framework to facilitate multimodal interaction in a virtual-device environment. The framework developed is called the Multi Interface-Device Binding (MID-B) system. MID-B provides the functions and features to overcome the drawbacks of classic multimodal interaction. In the classical sense, multimodality uses a strategy that simultaneously utilises several modalities, generally offered on a 'single' device. In contrast, MID-B's mechanisms take multimodality out of the single-device scenario. In MID-B, a 'controller-device' is aware of the availability of various devices in the vicinity, each of which may host one or more user interfaces (modalities). The capabilities of those co-located devices, together with the context in which the user acts, are exploited to dynamically customise the interface services available. MID-B binds these devices into a virtual device to exploit their individual user interfaces (modalities) in a combined way.
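The binding idea described above can be sketched in a few lines of code. This is a minimal illustration only, assuming a simple set-cover style of binding; the class and method names (`Device`, `Controller`, `bind`, and so on) are hypothetical and are not taken from the MID-B system itself:

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    """A co-located device hosting one or more user interfaces (modalities)."""
    name: str
    modalities: set = field(default_factory=set)  # e.g. {"speech-in", "display"}

@dataclass
class VirtualDevice:
    """The bound result: several devices exposed as one combined interface."""
    members: list

    @property
    def modalities(self):
        # The virtual device offers the union of its members' modalities.
        combined = set()
        for d in self.members:
            combined |= d.modalities
        return combined

class Controller:
    """The 'controller-device': tracks devices in the vicinity and binds them."""
    def __init__(self):
        self.nearby = []

    def discover(self, device):
        self.nearby.append(device)

    def bind(self, required):
        """Greedily select devices whose modalities together cover `required`."""
        chosen, covered = [], set()
        for d in self.nearby:
            if d.modalities - covered:       # device contributes something new
                chosen.append(d)
                covered |= d.modalities
            if required <= covered:
                return VirtualDevice(chosen)
        return None  # requirement cannot be met with devices in the vicinity

# Usage: combine a phone's speech input with a nearby screen's visual output.
ctrl = Controller()
ctrl.discover(Device("phone", {"speech-in", "keypad"}))
ctrl.discover(Device("wall-screen", {"display"}))
vd = ctrl.bind({"speech-in", "display"})
```

In this sketch the virtual device is simply the union of its members' modalities; the actual MID-B mechanisms additionally take user and device context into account when selecting and adapting interfaces.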

The talk describes the MID-B architecture and its mechanisms for collecting context information about 'devices' and 'users', and presents the thesis's methodologies for exploiting that context information to dynamically adapt user interfaces.

Date and time: 
Wednesday, 21 November, 2007 - 11:00
Length: 
45 minutes
Location: 
Leith