Emotion Analysis Through EEG and Peripheral Physiological Signals Using KNN Classifier

  • Shourya Shukla
  • Rahul Kumar Chaurasiya
Conference paper
Part of the Lecture Notes in Computational Vision and Biomechanics book series (LNCVB, volume 30)


Emotions are characteristics of human beings triggered by an individual's mood, temperament or motivation. An emotion is essentially a response to stimuli experienced by the brain, and any change in one's emotional state results in changes in the electrical signals the brain generates. Emotions can be explicit or implicit, i.e. they may be expressed or remain unexpressed by the individual. Because emotions arise from brain stimuli, the Electroencephalogram (EEG) signal can be observed to classify them. Peripheral physiological signals may also be taken into account, since a change in emotional state produces physiological changes as well. For the analysis, we use the standard DEAP dataset for emotion analysis, in which each of 32 test subjects was shown 40 different one-minute music videos while EEG and other physiological signals were recorded. On the basis of Self-Assessment Manikin (SAM) ratings, we classify the emotional state in the valence-arousal plane. The K-Nearest Neighbour (KNN) classifier is used to classify the multi-class emotions as higher/lower levels of the valence-arousal plane. A comparison of KNN with other classifiers shows that KNN produces the best average accuracy, 87.1%.
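The pipeline the abstract describes (per-trial EEG features, SAM ratings thresholded into high/low valence or arousal classes, KNN classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature layout (32 channels × 4 band powers) and the threshold of 5 on the 1-9 SAM scale are assumptions, and synthetic data stands in for the DEAP recordings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for DEAP-style data: 32 subjects x 40 trials,
# with one feature vector per trial (assumed: 32 channels x 4 band powers).
rng = np.random.default_rng(0)
n_trials, n_features = 32 * 40, 32 * 4
X = rng.normal(size=(n_trials, n_features))

# SAM ratings lie on a 1-9 scale; thresholding at 5 (an assumption)
# yields the high/low valence classes used for classification.
ratings = rng.uniform(1, 9, size=n_trials)
y = (ratings > 5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Standardise features, then fit KNN; k=5 is a tunable hyperparameter.
scaler = StandardScaler().fit(X_train)
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(scaler.transform(X_train), y_train)

acc = knn.score(scaler.transform(X_test), y_test)
print(f"test accuracy: {acc:.3f}")
```

On real DEAP band-power features the same structure applies per subject or pooled across subjects; on this random data the accuracy is of course near chance.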



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. National Institute of Technology, Raipur, India
