Multimodal Gaze-Supported Interaction

Speaker  Sophie Stellmach

Affiliation  Technische Universität Dresden

Host  sarahn

Duration  00:54:17

Date recorded  12 November 2013

While our eye gaze is an important medium for perceiving our environment, it also serves as a fast and implicit way of signaling interest in somebody or something. This could also benefit flexible and convenient interaction with diverse computing systems, ranging from small handheld devices to multiple large screens. Considerable research has already been pursued on gaze-only interaction, which, however, is often described as error-prone, imprecise, and unnatural. To overcome these challenges, multimodal combinations of gaze with additional input modalities show high potential for fast, fluent, and convenient human-computer interaction in diverse user contexts. A promising example of this novel style of multimodal gaze-supported interaction is the seamless selection and manipulation of graphical objects on distant screens using a combination of a mobile handheld device (such as a smartphone) and gaze input.

In my talk, I will provide a brief introduction to gaze-based interaction in general and present insights into my research at the Interactive Media Lab. In particular, I will emphasize the high potential of the emerging area of multimodal gaze-supported interaction.

©2013 Microsoft Corporation. All rights reserved.