
“Am I Talking to a Human or a Robot?”: A Preliminary Study of Human’s Perception in Human-Humanoid Interaction and Its Effects in Cognitive and Emotional States

  • Evangelia Baka (corresponding author)
  • Ajay Vishwanath
  • Nidhi Mishra
  • Georgios Vleioras
  • Nadia Magnenat Thalmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11542)

Abstract

This preliminary study examines the effects that human-humanoid interaction can have on human emotional states and behaviors through physical interaction. We compared three conditions in which participants physically interacted with a neutral person, with the social robot Nadine, and with the person on whom Nadine was modelled, Professor Nadia Thalmann. To support our research, we used EEG recordings to capture the physiological signals produced by the brain during each interaction, audio recordings to compare speech features, and a questionnaire to provide complementary psychometric data. Our results mainly showed frontal theta oscillations while participants interacted with the humanoid, which probably reflect their higher cognitive effort, as well as differences in the occipital area of the brain and thus in visual attention mechanisms. Participants' concentration and motivation were higher while interacting with the robot, which also indicates greater interest. The outcome of this experiment can broaden the field of human-robot interaction, leading to more efficient, meaningful, and natural interaction between humans and robots.
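For readers unfamiliar with the EEG measure mentioned above, the sketch below illustrates one common way to quantify frontal theta activity: band power in the 4–8 Hz range estimated from a single frontal channel with Welch's method. This is a generic illustration, not the authors' analysis pipeline; the sampling rate, the channel choice (Fz), and the synthetic signal are assumptions made purely for the example.

```python
# Hypothetical sketch, not the authors' pipeline: quantifying frontal theta
# (4-8 Hz) activity as band power estimated with Welch's method.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed; not reported in this section)

def band_power(signal: np.ndarray, fs: int, low: float, high: float) -> float:
    """Integrate the power spectral density of `signal` over [low, high] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)  # 2-second windows
    mask = (freqs >= low) & (freqs <= high)
    return trapezoid(psd[mask], freqs[mask])

# Stand-in for one minute of EEG from a frontal electrode such as Fz.
rng = np.random.default_rng(seed=0)
eeg = rng.normal(size=60 * FS)

theta = band_power(eeg, FS, 4.0, 8.0)
alpha = band_power(eeg, FS, 8.0, 13.0)
print(f"frontal theta power: {theta:.4f}")
print(f"theta/alpha ratio:   {theta / alpha:.3f}")
```

In a study like the one described, such a value would be computed per participant and per condition (neutral person, Nadine, Professor Thalmann), and the condition means compared to test for elevated frontal theta during the human-robot interaction.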

Keywords

Human-robot interaction · EEG · Speech · Social robots · Social cognition · Emotional communication


Acknowledgments

This research is partly supported by the BeingTogether Centre, a collaboration between Nanyang Technological University (NTU) Singapore and the University of North Carolina (UNC) at Chapel Hill. The BeingTogether Centre is supported by the National Research Foundation, Prime Minister's Office, Singapore, under its International Research Centres in Singapore Funding Initiative. This research is also partly supported by the European Commission through the MINGEI project.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Evangelia Baka (1), corresponding author
  • Ajay Vishwanath (2)
  • Nidhi Mishra (2)
  • Georgios Vleioras (3)
  • Nadia Magnenat Thalmann (1, 2)
  1. MIRALab, University of Geneva, Geneva, Switzerland
  2. Institute of Media Innovation, Nanyang Technological University, Singapore, Singapore
  3. University of Thessaly, Volos, Greece
