This project explores the use of new touchless technology in medical practice.
With advances in medical imaging, surgical procedures have become increasingly reliant on a range of digital imaging systems for navigation, reference, diagnosis and documentation. Interacting with images in surgical settings poses particular challenges because of the need to maintain boundaries between the sterile and non-sterile aspects of the surgical environment and its practices. Traditional input devices such as keyboards, mice and touch screens rely on physical contact, which creates the possibility of contaminated material being transferred between sterile and non-sterile areas. This constraint creates difficulties for surgical staff who are scrubbed up and must depend on others to manipulate images on their behalf. The resulting inefficiencies can entail potential medical complications and can interfere with the surgeon's interpretive and analytic use of the images.
The aim of the project, then, is to explore the use of touchless interaction within surgical settings, allowing images to be viewed, controlled and manipulated without contact through camera-based gesture recognition technology. In particular, the project seeks to understand the challenges these environments pose for the design and deployment of such systems, and to articulate the ways in which these technologies may alter surgical practice. While our primary concern is maintaining conditions of asepsis, touchless gesture-based technologies offer other potential benefits. For example, they present interesting possibilities for interacting with emerging 3D imaging technologies. By enabling interaction at a distance, they also open up new ways for surgeons to spatially configure themselves with respect to the various screens within surgical settings. Such technologies thus offer the potential to re-imagine the spatial environments in which image-based surgery takes place.
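To make the interaction style concrete, the sketch below shows one simple way a camera-based system might recognise a horizontal "swipe" gesture from a stream of tracked hand positions, such as those a depth camera's skeletal tracker could report. This is an illustrative sketch only; the function name, thresholds and logic are our own assumptions, not the project's actual recognition system.

```python
# Hypothetical sketch: classify a window of tracked hand x-positions
# (metres, camera space) as a left or right swipe. Assumes an upstream
# skeletal tracker supplies per-frame hand coordinates.

def detect_swipe(hand_xs, min_travel=0.30, max_frames=15):
    """Return 'left', 'right', or None for a window of hand x-positions.

    A swipe here is sustained, monotonic horizontal motion covering at
    least `min_travel` metres within at most `max_frames` frames; both
    thresholds are illustrative, not tuned values from the project.
    """
    if len(hand_xs) < 2 or len(hand_xs) > max_frames:
        return None
    # Frame-to-frame displacements; a swipe must not reverse direction.
    deltas = [b - a for a, b in zip(hand_xs, hand_xs[1:])]
    travel = hand_xs[-1] - hand_xs[0]
    if all(d > 0 for d in deltas) and travel >= min_travel:
        return "right"
    if all(d < 0 for d in deltas) and -travel >= min_travel:
        return "left"
    return None
```

In a setting like the one described above, "right" might advance to the next image in a scan series and "left" step back, letting a scrubbed clinician browse images without touching any surface.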
- Trial of "touchless" gaming technology in surgery, Adam Brimelow, BBC News Health (May 31, 2012)
- Touchless technology put to test by surgeons (video), Adam Brimelow, BBC News (May 31, 2012)
- Kinect imaging lets surgeons keep their focus, MacGregor Campbell, New Scientist, Tech (May 17, 2012)
- Interacting without Touching, Inside Microsoft Research (March 8, 2012)
- Microsoft's TechFest Trots Out 'What is Now Possible' for Computers, The Seattle Times, Business/Technology (March 7, 2012)
- Microsoft Installs Kinect in Operating Rooms, 01net (March 7, 2012) [in French]
- Microsoft Shows Off Kinect-Based Projects at TechFest Research Fair, The Tech Journal (March 6, 2012)
- Microsoft showcases new Kinect-centric projects at its TechFest Research Fair, ZDNet (March 6, 2012)
- Helena M. Mentis, Kenton O'Hara, Abigail Sellen, and Rikin Trivedi, Interaction Proxemics and Image Use in Neurosurgery, ACM Conference on Computer-Human Interaction, 2012
- Antonio Criminisi, Jamie Shotton, and Ender Konukoglu, Decision Forests: A Unified Framework for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning, in Foundations and Trends® in Computer Graphics and Vision: Vol. 7, No. 2-3, pp. 81-227, NOW Publishers, 2012
- Ross Girshick, Jamie Shotton, Pushmeet Kohli, Antonio Criminisi, and Andrew Fitzgibbon, Efficient Regression of General-Activity Human Poses from Depth Images, in ICCV, IEEE, October 2011
- Rose Johnson, Kenton O'Hara, Abigail Sellen, Claire Cousins, and Antonio Criminisi, Exploring the Potential for Touchless Interaction in Image Guided Interventional Radiology, in ACM Conference on Computer-Human Interaction (CHI), May 2011. Honourable Mention Award.
See also the Medical Image Analysis project page: http://research.microsoft.com/medicalimageanalysis/