Listen-n-feel: An Emotion Sensor on the Phone Using Speech Processing and Cloud Computing

Speaker  Na Yang

Affiliation  Microsoft Intern

Host  Arjmand Samuel

Duration  00:34:29

Date recorded  2 August 2011

This talk will present 'Listen-n-Feel', an emotion sensor for Windows phones that listens to the user's speech and tells whether the user is happy or sad based on audio signal features. The application can be used in social networks, integrated into character-playing games, or applied to monitoring patients with mental illness and to other health-care settings. Recorded audio is processed in the cloud, where signal features are extracted in both the time domain and the frequency domain. A machine learning method, trained on a prosody database, predicts emotion from statistics of these speech features. The emotion detection application will also be demoed during the presentation.
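The abstract does not specify which features are used; as a rough illustration of the kind of pipeline described (frame-level time- and frequency-domain features, summarized into per-utterance statistics for a classifier), here is a minimal sketch. The frame sizes, the particular features (energy, zero-crossing rate, spectral centroid), and the summary statistics are all assumptions for illustration, not the talk's actual method.

```python
import numpy as np

def extract_features(signal, sr=16000, frame_len=400, hop=160):
    """Frame the signal and compute simple frame-level features,
    then summarize them into per-utterance statistics."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    window = np.hanning(frame_len)  # reduce spectral leakage
    feats = []
    for f in frames:
        # Time-domain features: short-time energy and zero-crossing rate
        energy = float(np.mean(f ** 2))
        zcr = float(np.mean(np.abs(np.diff(np.sign(f)))) / 2)
        # Frequency-domain feature: spectral centroid of the windowed frame
        spec = np.abs(np.fft.rfft(f * window))
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
        centroid = float(np.sum(freqs * spec) / (np.sum(spec) + 1e-12))
        feats.append((energy, zcr, centroid))
    arr = np.array(feats)
    # Per-utterance statistics (mean and std of each frame-level feature),
    # the kind of fixed-length vector a classifier would be trained on.
    return np.concatenate([arr.mean(axis=0), arr.std(axis=0)])

# Demo on a synthetic 220 Hz tone standing in for recorded speech
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220 * t)
stats = extract_features(tone, sr)
print(stats.shape)  # 3 means + 3 standard deviations
```

In the system described in the talk, a vector like `stats` would be computed on the cloud for each recording and passed to a classifier trained on a labeled prosody database.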

©2011 Microsoft Corporation. All rights reserved.