CHI 2012, the Association for Computing Machinery’s Conference on Human Factors in Computing Systems, is the premier international conference on human-computer interaction. CHI 2012 emphasizes using human-computer interaction as a tool for connecting people, cultures, technologies, experiences, and ideas.
Microsoft Research contributed to this year's event through a diverse range of papers, notes, presentations, projects, and organizational leadership. Exploring how user interfaces can become more natural and the ways in which computers can work even more seamlessly on our behalf, Microsoft Research is developing technologies that advance the state of the art of computing and change our interaction with society and the environment.
Read on to discover the latest in human-computer interaction and Natural User Interface (NUI) developments.
CHI 2012 Melds Physical and Virtual
Microsoft researchers participating in CHI 2012 show new ways for sound, light, motion, and ordinary consumer electronics to enhance human interaction with computers.
SoundWave: Using the Doppler Effect to Sense Gestures
Gestures are becoming an increasingly popular means of interacting with computers. However, it is still relatively costly to deploy robust gesture-recognition sensors in existing mobile platforms. SoundWave is a real-time sensing technique that leverages a speaker and a microphone to robustly sense in-air gestures and motion around a device. It is capable of detecting a variety of gestures, and can directly control existing applications without requiring a user to wear any special sensors.
Learn more >>
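The idea behind this kind of Doppler sensing can be sketched in a few lines: the speaker emits an inaudible pilot tone, and a hand moving near the device shifts the frequency of the reflection the microphone picks up — higher when the hand approaches, lower when it retreats. The sketch below simulates a shifted reflection and reads the direction of motion off the spectrum. All frequencies, thresholds, and function names here are illustrative assumptions, not details taken from the SoundWave paper.

```python
import numpy as np

FS = 44100   # sample rate (Hz)
F0 = 18000   # inaudible pilot tone (Hz)
N = 4410     # analysis window (0.1 s; F0 falls on an exact FFT bin)

def doppler_direction(mic, fs=FS, f0=F0, guard=20.0, band=300.0):
    """Compare reflected energy just above vs. just below the pilot tone.
    Motion toward the device compresses the reflection to a higher
    frequency; motion away stretches it lower. The guard band keeps
    the pilot tone itself out of both sums."""
    spec = np.abs(np.fft.rfft(mic * np.hanning(len(mic))))
    freqs = np.fft.rfftfreq(len(mic), 1.0 / fs)
    upper = spec[(freqs > f0 + guard) & (freqs < f0 + band)].sum()
    lower = spec[(freqs < f0 - guard) & (freqs > f0 - band)].sum()
    if upper > 2 * lower:
        return "toward"
    if lower > 2 * upper:
        return "away"
    return "still"

# Simulate a microphone capture: the pilot tone plus a faint
# reflection Doppler-shifted +100 Hz by an approaching hand.
t = np.arange(N) / FS
mic = np.sin(2 * np.pi * F0 * t) + 0.2 * np.sin(2 * np.pi * (F0 + 100) * t)
print(doppler_direction(mic))   # → "toward"
```

A real implementation would run this continuously over streaming audio and track the shifted-energy band over time to recognize richer gestures, such as repeated or two-handed motions.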
Humantenna: Using the Body as an Antenna for Real-Time Whole-Body Interaction
Humantenna senses whole-body gestures with no instrumentation of the environment and only minimal instrumentation of the user. The human body acts as an antenna, picking up the electromagnetic noise radiated by appliances and power lines, and this project uses that noise as its signal. By measuring the voltage over time at a single point on the body's surface, we are able to classify which gesture the user is performing. Humantenna can also identify the user's location, and this combined location and gesture information can support a variety of context-aware ubiquitous computing applications.
Learn more >>
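The pipeline described above — measure the body's voltage over time, then classify the gesture — can be sketched as follows. Different poses change how strongly the body couples to ambient power-line noise, so the strengths of the 60 Hz hum and its harmonics serve as features. This is a simplified illustration: the harmonic features, the nearest-centroid classifier, and all gesture names and gains are assumptions for the sketch, not the classifier or signals used by Humantenna.

```python
import numpy as np

FS = 1000  # sample rate for the body-voltage measurement (Hz, illustrative)

def features(v):
    """Magnitudes of the 60 Hz power-line hum and its first
    two harmonics in the measured body voltage."""
    spec = np.abs(np.fft.rfft(v))
    freqs = np.fft.rfftfreq(len(v), 1.0 / FS)
    return np.array([spec[np.abs(freqs - h).argmin()] for h in (60, 120, 180)])

def classify(v, templates):
    """Nearest-centroid match against per-gesture feature templates."""
    return min(templates, key=lambda g: np.linalg.norm(features(v) - templates[g]))

# Simulate training data: each pose couples the harmonics
# into the body with a different set of gains.
t = np.arange(FS) / FS  # one second of signal
def body_signal(gains):
    return sum(g * np.sin(2 * np.pi * h * t) for g, h in zip(gains, (60, 120, 180)))

templates = {
    "arms_raised": features(body_signal([1.0, 0.2, 0.1])),
    "arms_down":   features(body_signal([0.3, 0.8, 0.4])),
}
print(classify(body_signal([0.9, 0.25, 0.1]), templates))   # → "arms_raised"
```

In practice one would record many labeled examples per gesture and train a proper classifier, but the structure — voltage trace in, spectral features out, label by similarity — is the same.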
Interaction Proxemics and Image Use in Neurosurgery
With advances in medical imaging over the years, surgical procedures have become increasingly reliant on a range of digital imaging systems for navigation, reference, diagnosis, and documentation. Interacting with images in these surgical settings poses particular challenges arising from the need to maintain boundaries between the sterile and non-sterile aspects of the surgical environment and its practices. Traditional input devices such as the keyboard, mouse, and touch screen rely on physical contact, and such contact-based interaction introduces the possibility of contaminated material being transferred between the sterile and non-sterile. This constraint creates difficulties for scrubbed-in surgical staff, who must depend on others to manipulate images on their behalf. The result can be inefficiencies, which in turn can entail potential medical complications, and it can also interfere with the surgeon’s interpretive and analytic use of the images.
Learn more >>