BodyScope: A Wearable Acoustic Sensor for Activity Recognition

Koji Yatani and Khai N. Truong


Accurate activity recognition enables the development of a variety of ubiquitous computing applications, such as context-aware systems, lifelogging, and personal health systems. Wearable sensing technologies can gather data for activity recognition without requiring sensors to be installed in the infrastructure. However, accurate recognition of a larger number of different activities may require the user to wear multiple sensors. We developed a wearable acoustic sensor, called BodyScope, to record the sounds produced in the user’s throat area and classify them into user activities, such as eating, drinking, speaking, laughing, and coughing. The F-measure of the Support Vector Machine classification of 12 activities using only our BodyScope sensor was 79.5%. We also conducted a small-scale in-the-wild study, and found that BodyScope was able to identify four activities (eating, drinking, speaking, and laughing) with 71.5% accuracy.
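The classification step described above can be sketched as follows. This is an illustrative example, not the authors' code: it trains a linear Support Vector Machine on synthetic feature vectors standing in for acoustic features (e.g., MFCC-like descriptors of throat sounds), with hypothetical activity labels drawn from the abstract.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical activity labels from the paper's in-the-wild study.
ACTIVITIES = ["eating", "drinking", "speaking", "laughing"]

# Synthetic stand-in for acoustic features: 40 thirteen-dimensional
# vectors per activity, with a per-class mean offset so the classes
# are separable (real features would come from the throat microphone).
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(40, 13))
               for i in range(len(ACTIVITIES))])
y = np.repeat(np.arange(len(ACTIVITIES)), 40)

# Linear SVM classifier, evaluated with 5-fold cross-validation.
clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In practice, a pipeline like this would replace the synthetic vectors with features extracted from fixed-length windows of the recorded audio; cross-validation gives a rough estimate of how well the classifier generalizes to unseen windows.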


Publication type: Inproceedings
Published in: ACM International Conference on Ubiquitous Computing