Microsoft Research
Computational User Experiences

Always-Available Mobile Interfaces

We have continually evolved computing to be not only more efficient, but also more accessible: to more people, in more places, more of the time. We have progressed from batch computing with punch cards, to interactive command-line systems, to mouse-based graphical user interfaces, and most recently to mobile computing. Each of these paradigm shifts has drastically changed the way we use technology for work and life, often in unpredictable and profound ways.

With the latest move to mobile computing, we now carry devices with significant computational power and capabilities on our bodies. However, their small size typically leads to limited interaction space (diminutive screens, buttons, and jog wheels) and consequently diminishes their usability and functionality. This presents a challenge and an opportunity for developing interaction modalities that will open the door for novel uses of computing.

Our work addresses these challenges by appropriating both the human body and the surrounding environment as interaction canvases. We achieve this by leveraging sensors used in medical contexts, and by applying signal processing and machine learning techniques that extract data about gesture and human behavior from those sensors.




Workout: Automatic Exercise Analysis

Although numerous devices exist to track and share exercise routines based on running and walking, these devices offer limited functionality for strength-training exercises. We introduce a system for automatically tracking repetitive exercises — such as weight training and calisthenics — via an arm-worn inertial sensor, with no user-specific training and no intervention during a workout.
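The core signal-processing idea, counting repetitions in a roughly periodic inertial signal, can be sketched with a simple autocorrelation approach. This is an illustrative stand-in, not the system's actual pipeline; the function names, thresholds, and synthetic data below are our own assumptions.

```python
import numpy as np

def count_reps(signal, fs, min_period_s=1.0, max_period_s=6.0):
    """Estimate the repetition count of a periodic exercise signal by
    finding the dominant period via autocorrelation, then dividing the
    recording length by that period (a toy sketch, not the paper's method)."""
    x = signal - np.mean(signal)
    # Autocorrelation at non-negative lags
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    lo, hi = int(min_period_s * fs), int(max_period_s * fs)
    period = lo + int(np.argmax(ac[lo:hi]))   # strongest lag in plausible range
    return int(round(len(x) / period))

# Synthetic "accelerometer" trace: 10 repetitions at 0.5 Hz, sampled at 50 Hz
fs = 50
t = np.arange(0, 20, 1 / fs)
accel = np.sin(2 * np.pi * 0.5 * t) \
    + 0.1 * np.random.default_rng(0).standard_normal(len(t))
print(count_reps(accel, fs))  # → 10
```

Restricting the lag search to a plausible repetition-period range keeps the estimate from locking onto sample-to-sample noise or onto a multiple of the true period.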


Non-Contact Haptic Feedback Using Air Vortex Rings

We explore the use of air vortex rings to enable at-a-distance haptics. Unlike standard jets of air, which are turbulent and dissipate quickly, vortex rings can be focused to travel several meters and impart perceptible feedback.


Enabling Mobile Phones to Infer Where They Are Kept

We collected data from 693 participants to understand where people keep their phone in different contexts and why. Using this data, we identified three placement personas: Single Place Pat, Consistent Casey, and All-over Alex. We also built prototypes employing capacitive, multispectral, and accelerometer sensing to infer phone placements automatically.


Using the Doppler Effect to Sense Gestures

We present SoundWave, a technique that leverages commodity speakers and microphones to sense in-air gestures. We generate an inaudible tone, which is frequency-shifted when it reflects off moving objects; we measure this shift with the microphone to infer a variety of gestures.
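The underlying physics is straightforward to sketch. For a co-located speaker and microphone, the tone is Doppler-shifted twice, once arriving at the moving reflector and once on the reflected path back, giving the standard two-way relation. The tone frequency and hand velocity below are illustrative values, not figures from the paper.

```python
C = 343.0  # speed of sound in air at room temperature, m/s

def doppler_shift(f_tone_hz, velocity_mps):
    """Frequency shift of a tone reflected off an object moving toward (+)
    or away (-) from a co-located speaker/microphone pair. The two-way
    reflection gives f' = f * (c + v) / (c - v)."""
    return f_tone_hz * (C + velocity_mps) / (C - velocity_mps) - f_tone_hz

def velocity_from_shift(f_tone_hz, shift_hz):
    """Invert the two-way relation to recover the reflector's velocity
    from an observed frequency shift."""
    ratio = (f_tone_hz + shift_hz) / f_tone_hz
    return C * (ratio - 1) / (ratio + 1)

# A hand moving toward the laptop at 0.5 m/s, with a 20 kHz pilot tone:
shift = doppler_shift(20_000, 0.5)
print(round(shift, 1))                                # → 58.4 (Hz)
print(round(velocity_from_shift(20_000, shift), 3))   # → 0.5 (m/s)
```

A shift of a few tens of hertz on a ~20 kHz carrier is small, which is why the technique measures the shift in the frequency domain rather than trying to track the waveform directly.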


Sensing Gestures Using the Body as an Antenna

Home environments frequently offer a signal that is unique to locations and objects within the home: electromagnetic noise. In this work, we use the body as a receiving antenna and leverage this noise for gestural interaction.



PocketTouch: Through-Fabric Capacitive Touch Input

PocketTouch uses capacitive sensing to detect finger strokes through fabric (e.g., while your phone is still in your pocket).

Muscle-Computer Interfaces

Muscle-computer interfaces directly sense and decode human muscular activity rather than relying on physical device actuation or perceptible user actions. We believe this is a first step toward tapping into the wealth of information available in human physiology.

Skinput: Appropriating the Body as an Input Surface

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body.
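The shape of such a pipeline, features extracted from each tap's vibration window, then a classifier that maps features to a tap location, can be caricatured as follows. Everything here is our own toy construction: synthetic decaying-sinusoid "taps", band-energy features, and a nearest-centroid classifier standing in for the support vector machine used in the actual system.

```python
import numpy as np

FS = 5500      # sample rate (Hz); body-propagated vibrations are low-frequency
WINDOW = 256   # samples per tap window

def band_features(window, n_bands=10):
    """Mean FFT magnitude in equal-width frequency bands — a simplified
    stand-in for the real system's feature set."""
    mag = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    return np.array([b.mean() for b in np.array_split(mag, n_bands)])

class NearestCentroid:
    """Toy classifier: assign a tap to the class whose mean feature
    vector is closest in Euclidean distance."""
    def fit(self, X, y):
        y = np.array(y)
        self.centroids = {c: X[y == c].mean(axis=0) for c in set(y)}
        return self
    def predict(self, x):
        return min(self.centroids,
                   key=lambda c: np.linalg.norm(x - self.centroids[c]))

rng = np.random.default_rng(1)

def synth_tap(freq_hz):
    """Synthetic 'tap': a decaying sinusoid plus noise, standing in for
    the vibration measured after tapping a hypothetical arm location."""
    t = np.arange(WINDOW) / FS
    return np.exp(-40 * t) * np.sin(2 * np.pi * freq_hz * t) \
        + 0.05 * rng.standard_normal(WINDOW)

# Pretend taps at two locations ring at different dominant frequencies.
X = np.array([band_features(synth_tap(f)) for f in [150] * 20 + [400] * 20])
y = ["wrist"] * 20 + ["palm"] * 20
clf = NearestCentroid().fit(X, y)
print(clf.predict(band_features(synth_tap(400))))  # → palm
```

The toy works because each tap location excites a distinct spectral signature; the real system's challenge is that genuine body-propagated signatures are far subtler and vary across users and sensor placements.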



Phoneprioception: Enabling Mobile Phones to Infer Where They Are Kept

Jason Wiese, T. Scott Saponas, A.J. Bernheim Brush

Proceedings of ACM CHI 2013, May 2013

An Ultra-Low-Power Human Body Motion Sensor Using Static Electric Field Sensing

Gabe Cohn, Sidhant Gupta, Tien-Jui Lee, Dan Morris, Joshua Smith, Matthew Reynolds, Desney Tan, Shwetak Patel

Proceedings of ACM Ubicomp 2012, September 2012 (Best paper award)

Humantenna: Using the Body as an Antenna for Real-Time Whole-Body Interaction

Gabe Cohn, Dan Morris, Shwetak Patel, Desney Tan

Proceedings of ACM CHI 2012, May 2012 (Best paper nomination)

SoundWave: Using the Doppler Effect to Sense Gestures

Sidhant Gupta, Dan Morris, Shwetak Patel, Desney Tan

Proceedings of ACM CHI 2012, May 2012

Making Gestural Input from Arm-Worn Inertial Sensors More Practical

Louis Kratz, T. Scott Saponas, Dan Morris

Proceedings of ACM CHI 2012, May 2012

GyroTab: A Handheld Device that Provides Reactive Torque Feedback

Akash Badshah, Sidhant Gupta, Dan Morris, Shwetak Patel, Desney Tan

Proceedings of ACM CHI 2012, May 2012

Emerging Input Technologies for Always-Available Mobile Interaction

Dan Morris, Scott Saponas, Desney Tan

Foundations and Trends in Human-Computer Interaction, 4(4), pp. 245-316, November 2011

PocketTouch: Through-Fabric Capacitive Touch Input

T. Scott Saponas, Chris Harrison, Hrvoje Benko

Proceedings of UIST 2011, October 2011

Skinput: Appropriating the Skin as an Interactive Canvas

Chris Harrison, Desney Tan, Dan Morris

Communications of the ACM 54.8, August 2011

Your Noise is My Command: Sensing Gestures Using the Body as an Antenna

Gabe Cohn, Dan Morris, Shwetak Patel, Desney S Tan

Proceedings of ACM CHI 2011, May 2011 (Best paper award)

Interfaces on the Go: Enabling Mobile Micro-Interactions with Physiological Computing

Desney Tan, Dan Morris, and Scott Saponas

ACM Crossroads Magazine, June 2010

Skinput: Appropriating the Body as an Input Surface

Chris Harrison, Desney S Tan, Dan Morris

Proceedings of ACM CHI 2010, April 2010 (Best paper award)

Making Muscle-Computer Interfaces More Practical

Scott Saponas, Desney Tan, Dan Morris, Jim Turner, and James Landay

Proceedings of ACM CHI 2010, April 2010

Enhancing Input On and Above the Interactive Surface with Muscle Sensing

Hrvoje Benko, Scott Saponas, Dan Morris, and Desney Tan

Proceedings of ACM Tabletop 2009, November 2009

Enabling Always-Available Input with Muscle-Computer Interfaces

Scott Saponas, Desney Tan, Dan Morris, Ravin Balakrishnan, Jim Turner, James Landay

Proceedings of ACM UIST 2009, October 2009

Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces

T. Scott Saponas, Desney S Tan, Dan Morris, Ravin Balakrishnan

Proceedings of ACM CHI 2008, April 2008





Control your laptop with a wave of your hand

CNN Money, 7 August 2012

Next Up in Kinect-Style Motion Sensing: Ultrasound?

Popular Mechanics, 25 May 2012

Microsoft Turns Jazz Hands Into Gesture Commands Using Sound Waves

IT World, 10 May 2012

Microsoft Research Projects Offer New Takes on Gesture Sensing

IDG News Service, 9 May 2012

Beyond Kinect: Gestural Computer Spells Keyboard Death

New Scientist, 9 May 2012

Gesture Sensing Alternatives Use Radio Interference, Doppler Effect

PC World, 9 May 2012

Laptop Uses Sound for Gesture Control

Discovery News, 9 May 2012

Cool Microsoft Research Takes Kinect To Another Level

PC Magazine, 7 May 2012

Gesture Control System Uses Sound Alone

Technology Review, 7 May 2012

Here’s Looking at You (but I’m Still Texting)

New York Times, 11 Feb 2012

Stealth Texting

Technology Review, 1 Jan 2012

10 Tech Research Projects to Watch (featuring PocketTouch)

PC World, 3 Jan 2012


How to Make a Human Antenna (also covered at ABC News and MSNBC)

Discovery News, 12 May 2011

Turn your entire home into a game controller

New Scientist, 10 May 2011

Talking to the Wall

Technology Review, 3 May 2011


Microsoft’s Skinput turns hands, arms into buttons

CNN, 19 April 2010

Skinput Makes the Entire Body a Touch Interface

PC World, 13 April 2010

Sensors turn skin into gadget control pad

BBC News, 26 March 2010

Body acoustics can turn your arm into a touchscreen

New Scientist, 1 March 2010

‘Skinput’ Turns Your Body Into Touchscreen Interface

TechNews, 3 March 2010

Skinput Turns Any Bodily Surface Into a Touch Interface

Popular Science, 3 March 2010

Skinput Turns Your Arm into a Touch-Screen

Wired, 3 March 2010


Muscle-Based PC Interface Lets You Literally Point and Click, No Mouse Required

Popular Science, 29 October 2009

Muscle-Bound Computer Interface

MIT Technology Review, 28 October 2009

The Quest for a Better Keyboard

Forbes Magazine, September 2009


High-tech Armband Puts your Fingers in Control

The New Scientist, 24 April 2008


Contact Scott Saponas and Dan Morris for questions about our work in this area.
