On the Use of Lateralization for Lightweight and Accurate Methodology for EEG Real Time Emotion Estimation Using Gaussian-Process Classifier

  • Mikel Val-Calvo (corresponding author)
  • José Ramón Álvarez-Sánchez
  • Alejandro Díaz-Morcillo
  • José Manuel Ferrández Vicente
  • Eduardo Fernández-Jover
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11486)


Emotion estimation systems based on electroencephalography (EEG) signals have gained special attention in recent years due to the possibilities they offer. The field of human-robot interaction (HRI) will benefit from a broadened understanding of brain emotional encoding, which will improve the ability of robots to engage with users' emotional reactions. In this paper, a methodology for real-time emotion estimation intended for use in HRI is proposed. It takes advantage of the lateralization of brain oscillations during emotional stimuli and of meaningful features related to intrinsic EEG patterns. Both the DEAP and SEED databases were used for validation. A Gaussian-Process classifier achieved a mean performance of 88.34% over four categories of the valence-arousal space and 97.1% over three discrete categories. This lightweight method could run on inexpensive, portable devices such as the OpenBCI system.
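The classification stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the lateralization features (here, random stand-ins for band-power differences between symmetric electrode pairs) and the four valence-arousal labels are assumptions for demonstration, using scikit-learn's Gaussian-Process classifier.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy stand-in for lateralization features: for each trial, the
# difference in band power between symmetric electrode pairs
# (e.g. F3-F4, C3-C4), one column per pair/band.
n_trials, n_features = 200, 8
X = rng.normal(size=(n_trials, n_features))
# Toy labels: the four quadrants of the valence-arousal space (0..3).
y = rng.integers(0, 4, size=n_trials)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RBF-kernel GP classifier; multiclass is handled one-vs-rest internally.
clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0)
clf.fit(X_tr, y_tr)

# Class probabilities per trial: shape (n_test_trials, 4).
proba = clf.predict_proba(X_te)
print(proba.shape)
```

A GP classifier yields calibrated class probabilities rather than hard labels, which is convenient in real-time HRI settings where uncertain estimates can be discarded or smoothed over a sliding window.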


Keywords: Emotion estimation · EEG · Robotics · Human-robot interaction



We acknowledge the support of the Programa de Ayudas a Grupos de Excelencia de la Región de Murcia, from Fundación Séneca, Agencia de Ciencia y Tecnología de la Región de Murcia.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Mikel Val-Calvo (1, 2), corresponding author
  • José Ramón Álvarez-Sánchez (1)
  • Alejandro Díaz-Morcillo (3)
  • José Manuel Ferrández Vicente (2)
  • Eduardo Fernández-Jover (4)

  1. Dpto. de Inteligencia Artificial, Universidad Nacional de Educación a Distancia (UNED), Madrid, Spain
  2. Dpto. Electrónica, Tecnología de Computadoras y Proyectos, Univ. Politécnica de Cartagena, Cartagena, Spain
  3. Dpto. Tecnologías de la Información y las Comunicaciones, Univ. Politécnica de Cartagena, Cartagena, Spain
  4. Instituto de Bioingeniería, Univ. Miguel Hernández, Elche, Spain
