Emotion recognition in response to traditional and tactile enhanced multimedia using electroencephalography


The goal of this study is to enhance the emotional experience of a viewer through enriched multimedia content that engages the tactile sense in addition to vision and hearing. A user-independent method of emotion recognition using electroencephalography (EEG) in response to tactile enhanced multimedia (TEM) is presented, with the aim of enriching the experience of viewing digital content. The selected traditional multimedia clips are converted into TEM clips by synchronizing them with an electric fan and a heater to add cold- and hot-air effects. This gives the viewer a more realistic feel by engaging three human senses: vision, hearing, and touch. EEG data are recorded from 21 participants in response to the traditional multimedia clips and their TEM versions. The self-assessment manikin (SAM) scale is used to collect a valence and arousal score for each clip to validate the evoked emotions. A t-test is applied to the valence and arousal values to measure any significant difference between the traditional multimedia and TEM clips. The resulting p-values show that the two types of content differ significantly in valence and arousal scores, indicating that TEM clips evoke stronger emotions. For emotion recognition, twelve time-domain features are extracted from the preprocessed EEG signal and a support vector machine is applied to classify four human emotions: happy, angry, sad, and relaxed. Accuracies of 43.90% and 63.41% are achieved for traditional multimedia and TEM clips, respectively, showing that EEG-based emotion recognition performs better when the tactile sense is engaged.
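The analysis pipeline described above can be sketched in code. The snippet below is a minimal, hypothetical illustration using synthetic data: a paired t-test on SAM arousal scores for the same clips with and without the tactile effect, a handful of time-domain EEG features (the exact twelve features used in the paper are not listed in the abstract, so the selection here is an assumption), and an SVM classifier for the four emotion classes.

```python
# Hypothetical sketch of the paper's pipeline on SYNTHETIC data.
# Feature choices, score distributions, and SVM settings are illustrative
# assumptions, not the authors' exact setup.
import numpy as np
from scipy import stats
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 1. Paired t-test: SAM arousal scores from 21 participants for the same
#    clips, with and without the tactile (hot/cold air) effect.
arousal_multimedia = rng.normal(5.0, 1.0, size=21)
arousal_tem = arousal_multimedia + rng.normal(0.8, 0.5, size=21)
t_stat, p_value = stats.ttest_rel(arousal_tem, arousal_multimedia)
print(f"paired t-test p-value: {p_value:.4f}")

# 2. Time-domain features from one EEG epoch (a subset of the kind of
#    features the paper mentions; the exact set is an assumption).
def time_domain_features(epoch):
    diff1 = np.diff(epoch)
    return np.array([
        epoch.mean(),                  # mean amplitude
        epoch.std(),                   # standard deviation
        np.abs(diff1).mean(),          # mean absolute first difference
        stats.skew(epoch),             # skewness
        stats.kurtosis(epoch),         # kurtosis
        np.sqrt(np.mean(epoch ** 2)),  # root mean square
    ])

# 3. SVM over the features for four classes: happy, angry, sad, relaxed.
X = np.vstack([
    time_domain_features(rng.normal(loc=k, scale=1.0, size=256))
    for k in range(4) for _ in range(30)  # 30 synthetic epochs per class
])
y = np.repeat(np.arange(4), 30)
clf = SVC(kernel="rbf")
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

With real EEG the epochs would come from preprocessed multi-channel recordings rather than random draws, and features would be computed per channel and concatenated before classification.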


Figures 1–7 (not shown in this preview).



Author information



Corresponding author

Correspondence to Muhammad Majid.



Cite this article

Raheel, A., Anwar, S.M. & Majid, M. Emotion recognition in response to traditional and tactile enhanced multimedia using electroencephalography. Multimed Tools Appl 78, 13971–13985 (2019). https://doi.org/10.1007/s11042-018-6907-3


Keywords

  • Emotion recognition
  • Multimedia
  • Tactile enhanced multimedia
  • Classification
  • Electroencephalography