Emotion recognition for semi-autonomous vehicles framework

  • Javier Izquierdo-Reyes
  • Ricardo A. Ramirez-Mendoza
  • Martin R. Bustamante-Bello
  • Jose L. Pons-Rovira
  • Jose E. Gonzalez-Vargas
Technical Paper

Abstract

Human beings, in their curiosity, have long wondered how to make machines feel and, at the same time, how a machine might detect emotions. The ability to feel emotions is often regarded as one of the human capacities that machines cannot replace. In recent years this assumption has been increasingly questioned by scientists seeking to understand how the brain functions, using state-of-the-art instrumentation, sensors, and signal processing, together with powerful machine learning methods. The field of emotion detection is gaining importance as technology advances, particularly with current developments in machine learning, the Internet of Things, Industry 4.0, and autonomous vehicles. Machines will need to be equipped with the capacity to monitor the state of the human user and to change their behaviour in response. Machine learning offers a route to this and can make use of data collected from questionnaires, facial expression scans, and physiological signals such as the electroencephalogram (EEG), the electrocardiogram, and the galvanic skin response. In this study, an approach is proposed to identify the emotional state of a subject from data collected in elicited-emotion experiments. An algorithm based on EEG data was developed, using the power spectral density of the cerebral frequency bands (alpha, beta, theta, and gamma) as features for classifier training, and a K-nearest-neighbours classifier with Euclidean distance was used to predict the emotional state of the subject. This article therefore proposes an approach to emotion recognition that does not rely only on images of the face, as in much of the previous literature, but also on physiological data. The algorithm was able to recognize nine different emotions (neutral, anger, disgust, fear, joy, sadness, surprise, amusement, and anxiety), nine positions on the valence axis, and nine positions on the arousal axis. Using data from only 14 EEG electrodes, an accuracy of approximately 97% was achieved. The approach was developed for evaluating the state of mind of a driver in a semi-autonomous vehicle, for example, but the system has a much wider range of potential applications, from product design to the evaluation of the user experience.
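
The pipeline described above (band power spectral density features from 14 EEG channels, classified with K nearest neighbours under Euclidean distance) can be illustrated with the following minimal Python sketch. It is not the authors' implementation: the sampling rate, band edges, Welch window length, value of k, and the helper names band_power_features and train_knn are illustrative assumptions.

    # Minimal sketch (not the authors' code): EEG band-power features + KNN classifier.
    # Sampling rate, band edges, window length, and k are illustrative assumptions.
    import numpy as np
    from scipy.signal import welch
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split

    FS = 128  # assumed EEG sampling rate in Hz
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

    def band_power_features(trial):
        """trial: array of shape (14, n_samples) -> 14 channels x 4 bands = 56 features."""
        freqs, psd = welch(trial, fs=FS, nperseg=FS * 2, axis=-1)  # PSD per channel
        feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, mask].mean(axis=-1))  # mean power in band, per channel
        return np.concatenate(feats)

    def train_knn(X_raw, y, k=5):
        """X_raw: (n_trials, 14, n_samples) EEG epochs; y: emotion labels (e.g. 9 classes)."""
        X = np.vstack([band_power_features(t) for t in X_raw])
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y)
        clf = KNeighborsClassifier(n_neighbors=k, metric="euclidean")
        clf.fit(X_train, y_train)
        return clf, clf.score(X_test, y_test)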

Keywords

Electroencephalography · Emotion recognition · K nearest neighbor · Autonomous vehicles · Semi-autonomous

Notes

Acknowledgements

This research was supported by Tecnologico de Monterrey and the Consejo Nacional de Ciencia y Tecnologia (CONACYT), Mexico, under scholarship 593255. We give special thanks to the Instituto Cajal for present and future collaboration.

Copyright information

© Springer-Verlag France SAS, part of Springer Nature 2018

Authors and Affiliations

  • Javier Izquierdo-Reyes
    • 1
    • 2
  • Ricardo A. Ramirez-Mendoza
    • 1
  • Martin R. Bustamante-Bello
    • 1
  • Jose L. Pons-Rovira
    • 2
  • Jose E. Gonzalez-Vargas
    • 2
  1. School of Engineering and Sciences, Tecnologico de Monterrey, Mexico City, Mexico
  2. Cajal Institute, Spanish National Research Council, Madrid, Spain