“Am I Talking to a Human or a Robot?”: A Preliminary Study of Human’s Perception in Human-Humanoid Interaction and Its Effects in Cognitive and Emotional States
This preliminary study investigates the effects that human-humanoid interaction can have on human emotional states and behaviors through physical interaction. We used three conditions in which participants faced three different types of physical interaction: with a neutral person, with the social robot Nadine, and with the person on whom Nadine was modelled, Professor Nadia Thalmann. To support our research, we used EEG recordings to capture the physiological signals derived from the brain during each interaction, audio recordings to compare speech features, and a questionnaire to provide complementary psychometric data. Our results mainly showed frontal theta oscillations while participants interacted with the humanoid, which likely reflect higher cognitive effort, as well as differences in the occipital area of the brain and thus in visual attention mechanisms. Participants' levels of concentration and motivation were higher while interacting with the robot, also indicating greater interest. The outcome of this experiment can broaden the field of human-robot interaction, leading to more efficient, meaningful and natural human-robot interaction.
Keywords: Human-robot interaction · EEG · Speech · Social robots · Social cognition · Emotional communication
This research is partly supported by the BeingTogether Centre, a collaboration between Nanyang Technological University (NTU) Singapore and the University of North Carolina (UNC) at Chapel Hill. The BeingTogether Centre is supported by the National Research Foundation, Prime Minister's Office, Singapore under its International Research Centres in Singapore Funding Initiative. This research is partly supported by the European Commission through the project MINGEI.