Learn How Microsoft Researchers Are Using Wearable Technology to Read Your Moods

Posted by Kelly Berschauer

Mary Czerwinski (left) and Asta Roseway

You can feel the stress building: you’re on deadline, your computer has ground to a standstill, you’re pounding keys in frustration, and your blood is boiling. You’re about to explode.

And at that exact moment, your computer tells you to take a deep breath and go for a walk.

Thanks to a team of researchers in Microsoft Research’s VIBE group (Visualization and Interaction for Business and Entertainment), the technology that would make that intervention possible is a work in progress, drawing on human-computer interaction and clinical psychology. Three years ago, the team started working in the area of affective computing: designing systems, some of which include wearable computing devices, that attempt to identify your mood and react accordingly, in order to help you reflect on your own emotional state.

On Nov. 20, Mary Czerwinski, principal researcher in the VIBE group, will deliver the closing keynote of the AMIA 2013 Annual Symposium, being held in Washington, D.C., where she’ll share her team’s innovative research to advance the field of affective computing with the health community.

Intrigued? Can’t be there in person and want to know more? We did too, so we turned to Channel 9, where the latest installment of the Microsoft Research Luminaries video series shines the spotlight on two of the researchers making affective computing a reality: Czerwinski and Asta Roseway, principal research designer.

A key tenet of the team’s work is understanding and aiding emotional health to improve the quality of life. Czerwinski says: “Our research goes beyond traditional fitness. It’s about emotional fitness.”

There are all kinds of ways a system could detect what you’re feeling: sensors could monitor your facial expressions, how quickly you are typing, the intensity of each keystroke, or the stress in your voice. Machine learning and data analytics could then combine all of these signals to accurately predict how you are feeling.
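To make that idea concrete, here is a minimal sketch in Python (using scikit-learn) of what tying several signals together might look like. The feature names (typing speed, keystroke intensity, voice stress), the values, and the mood labels are all invented for illustration; this is not the VIBE team’s actual model.

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature vectors: [keys_per_minute, keystroke_intensity, voice_stress].
# These features and values are invented purely to illustrate the idea.
X_train = [
    [250, 0.9, 0.8],   # rapid, hard typing and a strained voice
    [220, 0.8, 0.7],
    [90,  0.3, 0.2],   # relaxed typing and a calm voice
    [110, 0.4, 0.1],
]
y_train = ["stressed", "stressed", "calm", "calm"]

# Train a simple classifier that learns to tie the signals together.
model = RandomForestClassifier(n_estimators=10, random_state=0)
model.fit(X_train, y_train)

# A new reading: frustrated pounding on the keyboard while on deadline.
print(model.predict([[240, 0.95, 0.75]]))  # -> ['stressed']
```

A real system would draw on far richer signals and training data, but the basic pattern is the same: extract features from the sensors, then learn a mapping from those features to emotional states.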

Your computer may not be able to read you yet, but research in affective computing could soon make that a reality.

Be sure to tune in to Channel 9 to hear more about the novel ways this research is extending the boundaries of affective computing.