Muscle-Computer Interfaces (muCIs)

Many human-computer interaction technologies are mediated by physical transducers such as mice, keyboards, pens, dials, and touch-sensitive surfaces. While these transducers have enabled powerful interaction paradigms and leverage our expertise in manipulating physical objects, they tether computation to a physical artifact that must be within the user's reach.

As computing and displays integrate more seamlessly into our environment and are used in situations where the user is not always focused on the computing task, it is important to consider mechanisms for acquiring human input that do not require direct manipulation of a physical implement. We explore the feasibility of muscle-computer input: an interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or on user actions that are externally visible or audible.
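To make the decoding step concrete, the sketch below shows one common pattern for turning raw EMG into discrete input: slice the multi-channel signal into short windows, compute simple amplitude features per channel, and feed those features to a classifier. The channel count, window length, feature set, and SVM classifier are all illustrative assumptions, not the pipeline used in the projects below.

```python
import numpy as np
from sklearn.svm import SVC

def emg_features(window):
    """Per-channel features from one EMG window (samples x channels)."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))  # signal energy
    mav = np.mean(np.abs(window), axis=0)        # mean absolute value
    zc = np.mean(np.abs(np.diff(np.sign(window), axis=0)) > 0, axis=0)  # zero-crossing rate
    return np.concatenate([rms, mav, zc])

# Synthetic stand-in for labeled training windows; real data would come
# from an EMG device sampled at roughly 1 kHz.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 256, 8))  # 200 windows, 256 samples, 8 channels
labels = rng.integers(0, 4, size=200)         # 4 hypothetical gesture classes

X = np.array([emg_features(w) for w in windows])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(emg_features(windows[0])[None, :]))
```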

Projects

Enabling Always-Available Input with Muscle-Computer Interfaces
(ACM UIST 2009)

[Image: EMG gesture sensing]

We extend our previous results to bring muscle-computer interfaces closer to always-available input in real-world applications. We leverage existing taxonomies of natural human grips to develop a gesture set that covers interaction in free space, even when the hands are busy with other objects. We present a system that classifies these gestures in real time, and we introduce a bi-manual paradigm that enables their use in interactive systems. We report experimental results demonstrating four-finger classification accuracies averaging 79% for pinching, 85% while holding a travel mug, and 88% when carrying a weighted bag. We further show generalizability across different arm postures and explore the tradeoffs of providing real-time visual feedback.
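One way to read the bi-manual paradigm is as a gate on the recognizer: gestures from one hand are acted on only while the other hand holds an engagement contraction, which guards against accidental activation during everyday movement. The sketch below is a hypothetical reduction of that idea; the class name, method names, and gesture labels are ours, not the paper's.

```python
from typing import Optional

class BimanualGate:
    """Pass dominant-hand gestures through only while the other hand
    holds an 'engage' contraction (hypothetical names and logic)."""

    def __init__(self) -> None:
        self.engaged = False

    def update(self, engage_active: bool, gesture: Optional[str]) -> Optional[str]:
        self.engaged = engage_active
        # Gestures recognized while disengaged are discarded.
        return gesture if self.engaged else None

gate = BimanualGate()
print(gate.update(engage_active=False, gesture="pinch_index"))  # None: ignored
print(gate.update(engage_active=True, gesture="pinch_index"))   # passed through
```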

Enhancing Input On and Above the Interactive Surface with Muscle Sensing
(ACM Tabletop 2009)

[Image: forearm EMG armband]

Current interactive surfaces provide little or no information about which fingers are touching the surface, how much pressure is being exerted, or what gestures are made when the hands are not in contact with the surface. These limitations constrain the interaction vocabulary available to interactive surface systems. In our work, we extend the surface interaction space by using muscle sensing to provide complementary information about finger movement and posture. In this paper, we describe a novel system that combines muscle sensing with a multi-touch tabletop, and we introduce a series of new interaction techniques enabled by this combination. We present observations from an initial system evaluation and discuss the limitations and challenges of using muscle sensing for tabletop applications.
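A natural way to combine the two channels is to time-align each touch-down reported by the surface with the nearest muscle-sensed classification, so that the contact inherits a finger label the tabletop alone cannot supply. The pairing function below is a sketch under assumed event shapes and timestamps, not the system's actual event model.

```python
def fuse(touch_events, emg_labels, max_skew=0.05):
    """Pair each touch-down with the EMG finger label nearest in time.

    touch_events: list of (timestamp, (x, y)) contacts from the surface
    emg_labels:   list of (timestamp, finger) outputs from the muscle classifier
    max_skew:     largest timestamp gap (seconds) still treated as a match
    """
    fused = []
    for t_touch, (x, y) in touch_events:
        t_emg, finger = min(emg_labels, key=lambda e: abs(e[0] - t_touch))
        if abs(t_emg - t_touch) <= max_skew:
            fused.append({"t": t_touch, "x": x, "y": y, "finger": finger})
    return fused

touches = [(1.002, (412, 308)), (1.950, (120, 455))]
labels = [(0.998, "index"), (1.945, "middle")]
print(fuse(touches, labels))
```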

Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces
(ACM CHI 2008)

[Image: EMG sensing during surface interaction]

As a first step towards realizing the muCI concept, we conducted an experiment to explore the potential of muscular sensing and processing technologies for muCIs. We present results demonstrating accurate gesture classification with an off-the-shelf electromyography (EMG) device. Specifically, using 10 sensors worn in a narrow band around the upper forearm, we were able to differentiate the position and pressure of finger presses, as well as classify tapping and lifting gestures across all five fingers. We conclude with a discussion of the implications of our results for future muCI designs.
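For intuition about how pressure might be recovered from such a band, one simple approach is to normalize the band's RMS amplitude between a resting baseline and a maximum-effort calibration: harder presses recruit more muscle fibers and raise the EMG amplitude. The study itself treats pressure as a classification problem; this continuous estimate, with made-up calibration values, is only a simplified stand-in.

```python
import numpy as np

def press_pressure(window, rest_rms, max_rms):
    """Crude pressure estimate from a 10-channel EMG window:
    per-channel RMS amplitude, normalized between rest and maximum effort."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))      # (10,) channel RMS
    level = (rms - rest_rms) / (max_rms - rest_rms)  # 0 = rest, 1 = max effort
    return float(np.clip(level, 0.0, 1.0).mean())

rng = np.random.default_rng(1)
rest_rms = np.full(10, 0.05)                   # resting baseline (assumed)
max_rms = np.full(10, 1.00)                    # max-effort calibration (assumed)
window = 0.4 * rng.standard_normal((256, 10))  # one 256-sample, 10-channel window
print(f"estimated pressure: {press_pressure(window, rest_rms, max_rms):.2f}")
```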