Understanding the properties, challenges and social consequences of touchless body-based interactions
Through our bodily relationship and actions with objects, places, and each other, we are able to express and create a rich variety of meaning. Gestures, movement trajectories, spatial positioning, and proximity are all means through which we communicate, interpret, and control our world. With the emergence of ever more sophisticated sensing techniques, we are able to interact with the digital world in new ways without touch. These new mechanisms open up opportunities for exploring new genres of experiences and applications in a variety of contexts where touch-based interactions alone may not be possible, desirable, or as engaging. The important concerns for this theme are to understand the properties, challenges, and social consequences of touchless body-based interactions, and how to design experiences with these mechanisms that achieve value and meaning for people in everyday contexts.
- How do we perceive and engage with touchless systems?
- What new genres of applications are afforded by touchless interactions?
- How are opportunities for touchless interaction made visible and interpretable to ourselves and others?
- How do we ensure intentionality in touchless interaction systems?
- What are the social and collaborative properties of body-based touchless interfaces?
- What are the appropriate ways to couple body-based interactions with the digital world?
Touchless Interaction in Surgical Settings
Surgery is increasingly reliant on medical imaging, but a surgeon's interaction with images is constrained by the need to maintain sterility. Re-scrubbing is time consuming, and instructing others to manipulate the equipment interferes with the surgeon's interpretation of the images. Implementation challenges in this environment include concealment of the lower half of the surgeon's body by the patient table and the need to support multiple configurations of users, equipment, and displays. We have developed a system for manipulating 3D volumetric vascular renderings. The system enables surgeons to pan, zoom, mark, rotate, and fade the images, using both one-handed and two-handed gestures. This work is funded by MS Connections in collaboration with Lancaster University, Addenbrooke's Hospital in Cambridge, King's College London, and Guy's and St Thomas' Hospital in London. For more information and papers, please visit the Touchless Interaction in Medical Imaging pages.
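To make the gesture-to-image coupling concrete, the one-handed and two-handed manipulations can be sketched as simple frame-to-frame geometry: panning follows the displacement of a single tracked hand, while zoom and rotation follow the change in distance and angle between two hands. This is a minimal illustrative sketch under assumed 2D hand coordinates, not the system's actual implementation; the function names are hypothetical.

```python
import math

def one_hand_pan(prev, curr):
    """Pan the image by the tracked hand's displacement between frames."""
    return (curr[0] - prev[0], curr[1] - prev[1])

def two_hand_zoom_rotate(prev_l, prev_r, curr_l, curr_r):
    """Zoom from the change in inter-hand distance; rotate from the change
    in the angle of the line joining the two hands."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    zoom = dist(curr_l, curr_r) / dist(prev_l, prev_r)
    rotation = angle(curr_l, curr_r) - angle(prev_l, prev_r)
    return zoom, rotation
```

For example, hands moving from one unit apart to two units apart yields a zoom factor of 2.0 with no rotation; in practice a clutch gesture would gate when these mappings are active, which matters for intentionality in a sterile setting.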
Kinect in the Dark
This project explores a single quality of the Kinect IR sensor: that it works in the dark. Darkness is a context that invokes fear as well as imagination, and it takes away the sense of sight to the benefit of one's other senses. Through this system, a user discovers an invisible shape through sound feedback. The absence of visual distractions in the dark allows users to become increasingly focused on their own movement and their spatial relationship to this virtual shape. A system that allows one to focus on the body (whether by being in the dark or simply by removing peripheral devices such as a display or controller) means that one can attend to the interaction in relation to the body rather than in relation to the outside world. When users were in the light, they would look at their hands and at reference points in the room. When they were in the dark, they let go of looking and instead felt where their hands were, and thus where the shape was. This is interesting in relation to participants stating that the darkness felt more natural: what is more natural is using proprioception as a feedback mechanism rather than visual cues. More information can be found in our recent CHI paper.
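The sonification principle described above, where audio feedback intensifies as the hand nears the invisible surface, can be sketched as a distance-to-surface computation fed into a volume mapping. This is an illustrative sketch assuming a spherical virtual shape and an exponential volume falloff; the system's actual shape and audio mapping may differ.

```python
import math

def distance_to_sphere(hand, center, radius):
    """Signed distance from the hand to an invisible spherical surface
    (negative inside the sphere, zero on the surface)."""
    return math.dist(hand, center) - radius

def feedback_volume(signed_dist, falloff=0.5):
    """Map distance to a volume in [0, 1]: loudest on the surface,
    decaying exponentially as the hand moves away in either direction."""
    return math.exp(-abs(signed_dist) / falloff)
```

In a full system this volume would drive a continuous tone each frame, so the user can sweep a hand through space and "hear" the contour of the shape without any visual cue.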
Qualities of Movement
What if a system could pick up not only what movement we make but also how we move? We are exploring this in an interactive improvisational dance performance setting where audience members can influence the show by interacting with Kinect cameras, tweaked to pick up on movement qualities based on the Effort category of Laban Movement Analysis. The project is a collaboration between researchers at Microsoft Research, Cambridge, a choreographer and faculty member at Trinity Laban Conservatoire in London, and professional modern dancers.
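As a rough illustration of how movement qualities (rather than poses) can be quantified from tracked joints, one can derive Effort-like features from a trajectory: mean speed as a proxy for the Weight factor (strong vs. light) and mean change in speed as a proxy for the Time factor (sudden vs. sustained). This is a hypothetical sketch over a 1-D trajectory, not the project's actual Laban analysis.

```python
def effort_features(positions, dt=1.0 / 30.0):
    """Crude proxies for two Laban Effort factors from a 1-D joint
    trajectory sampled at interval dt: 'Weight' as mean speed and
    'Time' as mean absolute change in speed."""
    if len(positions) < 2:
        return 0.0, 0.0
    speeds = [abs(positions[i + 1] - positions[i]) / dt
              for i in range(len(positions) - 1)]
    accels = [abs(speeds[i + 1] - speeds[i]) / dt
              for i in range(len(speeds) - 1)]
    weight = sum(speeds) / len(speeds)
    time_effort = sum(accels) / len(accels) if accels else 0.0
    return weight, time_effort
```

A smooth, sustained gesture produces a low Time value even at high speed, while a jabbing, sudden gesture produces a high one, which is the distinction a quality-sensitive system would respond to.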
Gesture Based Interaction in the Home
As the Kinect sensor is extended from gaming to other applications and contexts, we critically examine the use of gestural interaction in smart homes. Through an exploratory study of family experiences with Kinect in gaming, we characterise the experience as one that revels in not being able to control one's body to satisfaction. Through this analysis, we liken the third space defined by Kinect-based gestural interaction to Bakhtin's Carnival and question whether that enchantment can transfer to other realms of home life.
Embodiment in Brain-Computer Interaction
With emerging opportunities for using Brain-Computer Interaction (BCI), there is a need to understand the opportunities and constraints of this interaction paradigm, particularly in real-world settings. The project draws on the philosophical traditions of embodied interaction and points to the importance of considering bodily experiences in BCI, not simply what is going on in the head. What this highlights is that bodily actions are used not only to facilitate control of brain activity but also to make people's actions and intentions visible to, and interpretable by, others playing and watching the game. It is the public availability of these bodily actions during BCI that allows action to be socially organised, understood, and coordinated with others, and through which social relationships can be played out. For more information, please see our CHI paper "Embodiment in Brain-Computer Interaction".
Trial of "touchless" gaming technology in surgery, Adam Brimelow, BBC News, Health, (May 31, 2012)
Touchless technology put to test by surgeons (video), Adam Brimelow, BBC News (May 31, 2012)
Kinect imaging lets surgeons keep their focus, MacGregor Campbell, New Scientist, Tech (May 17, 2012)
Interacting without Touching, Inside Microsoft Research (March 8, 2012)
Microsoft's TechFest Trots Out 'What is Now Possible' for Computers, The Seattle Times, Business/Technology (March 7, 2012)
Microsoft Installs Kinect in the Operating Room, 01net (March 7, 2012) [in French]
Microsoft Shows Off Kinect-Based Projects at TechFest Research Fair, The Tech Journal (March 6, 2012)
Microsoft showcases new Kinect-centric projects at its TechFest Research Fair, ZDNet (March 6, 2012)
- Kenton O'Hara, Gerardo Gonzalez, Abigail Sellen, Graeme Penney, Andreas Varnavas, Helena Mentis, Antonio Criminisi, Robert Corish, Mark Rouncefield, Neville Dastur, and Tom Carrell, Touchless Interaction in Surgery, in Communications of the ACM, December 2014.
- Helena M. Mentis and Alex S. Taylor, Imaging the body: embodied vision in minimally invasive surgery, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, New York, NY, USA, April 2013.
- Kenton O'Hara, Richard Harper, Helena Mentis, Abigail Sellen, and Alex Taylor, On the naturalness of touchless: putting the "interaction" back into NUI, in ACM Transactions on Computer-Human Interaction, 2013.
- Linden Vongsathorn, Kenton O'Hara, and Helena Mentis, Bodily Interaction in the Dark, ACM Conference on Human Factors in Computing Systems, 2013.
- Helena M. Mentis, Kenton O'Hara, Abigail Sellen, and Rikin Trivedi, Interaction Proxemics and Image Use in Neurosurgery, ACM Conference on Computer-Human Interaction, 2012.
- Dustin Freeman, Otmar Hilliges, Abigail Sellen, Kenton O'Hara, Shahram Izadi, and Ken Woodberry, The Role of Physical Controllers in Motion Video Gaming, in Designing Interactive Systems (DIS 2012), ACM, 2012.
- Simon Fothergill, Helena M. Mentis, Sebastian Nowozin, and Pushmeet Kohli, Instructing People for Training Gestural Interactive Systems, ACM Conference on Computer-Human Interaction, 2012.
- Rose Johnson, Kenton O'Hara, Abigail Sellen, Claire Cousins, and Antonio Criminisi, Exploring the Potential for Touchless Interaction in Image-Guided Interventional Radiology, ACM Conference on Computer-Human Interaction, 7 May 2011. Honourable Mention Award.
- Kenton O'Hara, Abigail Sellen, and Richard Harper, Embodiment in Brain-Computer Interaction, ACM Conference on Computer-Human Interaction, 7 May 2011.
Email: Kenton O'Hara for copies of papers that are not yet published.