Speaker Jeremy Gummeson
Host Bodhi Priyantha
Affiliation University of Massachusetts Amherst
Date recorded 18 September 2013
Performing gesture recognition without infrastructure has been a long-desired feature for seamless human-computer interaction. In recent years, tablets and smartphones have made two-dimensional gestures a truly ubiquitous input modality. Yet, as computing becomes more pervasive, capacitive touch surfaces may not always be readily available or appropriate in many use cases. Small form factor wearable computing devices that detect finger movements are an alternative way to approach this problem.
To address this, we developed a wearable platform that detects movements of a finger using audio and 3D acceleration. This platform converts finger movement data into a set of primitives that can be used for handwriting recognition or UI navigation in a context-sensitive manner. To detect these gestures, we use a sensor fusion approach: audio generated by surface friction detects movement along a surface, while acceleration determines the direction of finger movement during surface contact. By examining the relationship between the envelope of the friction-generated audio and the acceleration from movement, we can disambiguate gesture data from spurious finger movements at low power cost. We implemented a platform prototype that collects audio and acceleration data and found that our approach accurately detected gestures for several users. Our circuit design is amenable to a small form factor implementation, with the end goal of a platform that can be worn as a ring.
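The fusion idea described above can be sketched in a few lines: gate gesture detection on the friction-audio envelope, then read direction from the accelerometer only during surface contact. This is a minimal illustrative sketch, not the speakers' implementation; the envelope method, thresholds, and function names are assumptions.

```python
import numpy as np

def envelope(audio, win=256):
    """Rectify-and-smooth envelope via a moving average (assumed method)."""
    return np.convolve(np.abs(audio), np.ones(win) / win, mode="same")

def detect_gesture(audio, accel, env_thresh=0.1, acc_thresh=0.05):
    """Report a coarse gesture direction only when friction audio and
    acceleration agree, discarding spurious in-air finger movement.

    audio: 1-D array of microphone samples
    accel: (N, 3) array of accelerometer samples aligned with `audio`
    Thresholds are illustrative, not taken from the talk.
    """
    env = envelope(audio)
    contact = env > env_thresh                    # finger touching the surface
    if not contact.any():
        return None                               # no friction sound: no gesture
    # Mean in-plane acceleration during contact gives the movement direction.
    mean_acc = accel[contact].mean(axis=0)
    axis = int(np.argmax(np.abs(mean_acc[:2])))   # dominant in-plane axis
    if abs(mean_acc[axis]) < acc_thresh:
        return None                               # too little motion: ignore
    sign = "+" if mean_acc[axis] > 0 else "-"
    return ("x" if axis == 0 else "y") + sign
```

Gating on the audio envelope before consulting the accelerometer is what keeps power cost low: the accelerometer data only needs to be processed when the surface is actually being touched.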
©2013 Microsoft Corporation. All rights reserved.