
Multimedia Tools and Applications, Volume 78, Issue 10, pp 13971–13985

Emotion recognition in response to traditional and tactile enhanced multimedia using electroencephalography

  • Aasim Raheel
  • Syed Muhammad Anwar
  • Muhammad Majid

Abstract

The goal of this study is to enhance a viewer's emotional experience through enriched multimedia content that engages the tactile sense in addition to vision and hearing. A user-independent method of emotion recognition using electroencephalography (EEG) in response to tactile enhanced multimedia (TEM) is presented, with the aim of enriching the human experience of viewing digital content. The selected traditional multimedia clips are converted into TEM clips by synchronizing them with an electric fan and a heater to add cold and hot air effects, giving the viewer a more realistic feel by engaging three human senses: vision, hearing, and touch. EEG data are recorded from 21 participants in response to traditional multimedia clips and their TEM versions. The self-assessment manikin (SAM) scale is used to collect a valence and arousal score for each clip to validate the evoked emotions. A t-test is applied to the valence and arousal values to measure any significant difference between multimedia and TEM clips. The resulting p-values show that traditional multimedia and TEM content differ significantly in terms of valence and arousal scores, indicating that TEM clips enhance the evoked emotions. For emotion recognition, twelve time-domain features are extracted from the preprocessed EEG signal and a support vector machine is applied to classify four human emotions: happy, angry, sad, and relaxed. Accuracies of 43.90% and 63.41% are achieved for traditional multimedia and TEM clips respectively, which shows that EEG-based emotion recognition performs better when the tactile sense is engaged.
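The two quantitative steps in the abstract — a t-test comparing SAM scores between clip types, and time-domain feature extraction from an EEG window — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the abstract does not list the twelve features, so the four shown (mean, standard deviation, mean absolute first difference, Hjorth mobility) are common time-domain choices assumed for illustration, and the SAM scores below are hypothetical numbers, not the study's data.

```python
import math
import random

def time_domain_features(x):
    """A few time-domain features of the kind extracted per EEG window.
    The paper uses twelve; these four are illustrative assumptions."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / (n - 1)
    std = math.sqrt(var)
    # Mean absolute first difference: average sample-to-sample change.
    mad1 = sum(abs(x[i + 1] - x[i]) for i in range(n - 1)) / (n - 1)
    # Hjorth mobility: std of the first difference over std of the signal.
    diff = [x[i + 1] - x[i] for i in range(n - 1)]
    dmean = sum(diff) / len(diff)
    dvar = sum((d - dmean) ** 2 for d in diff) / (len(diff) - 1)
    mobility = math.sqrt(dvar / var)
    return [mean, std, mad1, mobility]

def welch_t(a, b):
    """Welch's t statistic for two independent samples, of the kind applied
    to valence/arousal scores of traditional vs. TEM clips."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((v - ma) ** 2 for v in a) / (na - 1)
    vb = sum((v - mb) ** 2 for v in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

random.seed(0)
window = [random.gauss(0.0, 1.0) for _ in range(256)]  # stand-in EEG window
print(time_domain_features(window))

# Hypothetical SAM valence scores (1-9 scale), NOT the study's data.
valence_multimedia = [4.1, 3.8, 4.5, 4.0, 3.9]
valence_tem        = [6.2, 5.8, 6.5, 6.0, 6.3]
print(welch_t(valence_multimedia, valence_tem))  # negative: TEM scores higher
```

In a full pipeline, the per-window feature vectors would then be fed to a support vector machine trained to separate the four emotion classes; the t statistic would be converted to a p-value against the appropriate t distribution.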

Keywords

Emotion recognition · Multimedia · Tactile enhanced multimedia · Classification · Electroencephalography


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Computer Engineering, University of Engineering and Technology, Taxila, Pakistan
