Daniel McDuff, Amy Karlson, Ashish Kapoor, Asta Roseway, and Mary Czerwinski
We present AffectAura, an emotional prosthetic that allows users to reflect on their emotional states over long periods of time. We designed a multimodal sensor set-up for continuous logging of audio, visual, physiological and contextual data, a classification scheme for predicting user affective state and an interface for user reflection. The system continuously predicts a user's valence, arousal and engagement, and correlates this with information on events, communications and data interactions. We evaluate the interface through a user study consisting of six users and over 240 hours of data, and demonstrate the utility of such a reflection tool. We show that users could reason forward and backward in time about their emotional experiences using the interface, and found this useful.
Published in: CHI 2012