Evaluation of a Self-report System for Assessing Mood Using Facial Expressions
Effective and frequent sampling of mood through self-reports could enable a better understanding of the interplay between mood and the events influencing it. To accomplish this, we built a mobile application featuring a sadness-happiness visual analogue scale and a facial expression-based scale. The goal is to evaluate whether a facial expression-based scale can adequately capture mood. The method and mobile application were evaluated with 11 participants, who rated the mood of characters presented in a series of vignettes using both scales. Participants also completed a user experience survey rating the two assessment methods and the mobile interface. Findings reveal a Pearson's correlation coefficient of 0.97 between the two assessment scales and a stronger preference for the face scale. We conclude with a discussion of the implications of our findings for mood self-assessment and an outline of future research.
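The agreement reported above is Pearson's correlation coefficient between paired ratings from the two scales. As a minimal sketch, the following computes it for hypothetical paired ratings (the values and the mapping of the face scale onto the same 0-100 range are illustrative assumptions, not the study's data):

```python
def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical ratings: visual analogue scale vs. face scale, both 0-100.
vas = [10, 25, 40, 55, 70, 85]
face = [12, 22, 43, 50, 72, 88]
r = pearson_r(vas, face)  # close to 1 when the two scales agree
```

A coefficient near 1, as found in the study, indicates that the two scales rank and space the vignettes' moods almost identically.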
Keywords: Mood assessment · Self-report system · User interface
This work has been supported by AffecTech: Personal Technologies for Affective Health, Innovative Training Network funded by the H2020 People Programme under Marie Skłodowska-Curie grant agreement No. 722022.