Comparison of User Responses to English and Arabic Emotion Elicitation Video Clips

  • Nawal Al-Mutairi
  • Sharifa Alghowinem
  • Areej Al-Wabil
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9180)


Abstract

To study the variation in emotional responses to stimuli, different methods have been developed to elicit emotions in a replicable way. Video clips have been shown to be the most effective stimulus. However, differences in cultural background lead to different emotional responses to the same stimulus. We therefore compared emotional responses in Saudi culture to commonly used emotion-eliciting video clips from Western culture and to an initial selection of emotion-eliciting Arabic video clips. We analysed skin physiological signals recorded from 29 Saudi participants in response to the video clips. The results for the validated English video clips and the initial Arabic video clips are comparable, which suggests that the English set has a universal capability to elicit the target emotions in a Saudi sample, and that a refined selection of Arabic emotion-elicitation clips would induce the target emotions with higher intensity.


Keywords: Emotion classification · Basic emotions · Physiological signals · Electro-dermal activity · Skin temperature



Acknowledgements

The authors extend their appreciation to the Deanship of Scientific Research at King Saud University for funding the work through the research group project number RGP-VPP-157.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Nawal Al-Mutairi (1)
  • Sharifa Alghowinem (2, 3)
  • Areej Al-Wabil (1)

  1. King Saud University, College of Computer and Information Sciences, Riyadh, Saudi Arabia
  2. Australian National University, Research School of Computer Science, Canberra, Australia
  3. Ministry of Education, Kingdom of Saudi Arabia, Riyadh, Saudi Arabia
