
Exploring Day-to-Day Variability in the Relations Between Emotion and EEG Signals

  • Yuan-Pin Lin
  • Sheng-Hsiou Hsu
  • Tzyy-Ping Jung
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9183)

Abstract

Electroencephalography (EEG)-based emotion classification has drawn increasing attention over the last few years and has become an emerging direction in brain-computer interfaces (BCI), namely affective BCI (ABCI). Many prior studies have been devoted to improving emotion-classification models using data collected within a single session or day. Less attention has been paid to the day-to-day EEG variability associated with emotional responses. This study recorded the EEG signals of 12 subjects, each of whom underwent a music-listening experiment on five different days, to assess day-to-day variability from the perspectives of inter-day data distributions and cross-day emotion classification. The empirical results demonstrated that clusters of the same emotion across days tended to scatter more widely than clusters of different emotions within a day. Such inter-day variability poses a severe challenge for building an accurate cross-day emotion-classification model for real-life ABCI applications.
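
The cross-day evaluation described above can be approximated with a leave-one-day-out scheme: train a classifier on EEG features from four recording days and test it on the held-out day. The Python sketch below is a minimal illustration under assumed conditions; the per-day feature matrices, binary emotion labels, and linear-SVM classifier are hypothetical placeholders, not the features or model reported in the paper.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Hypothetical data: for each of 5 recording days, a feature matrix
    # (trials x EEG features, e.g., spectral band power per channel) and
    # per-trial emotion labels. Shapes and the binary label scheme are
    # assumptions for illustration, not taken from the paper.
    rng = np.random.default_rng(0)
    days = [(rng.standard_normal((40, 60)), rng.integers(0, 2, 40))
            for _ in range(5)]

    # Leave-one-day-out cross-validation: train on four days, test on
    # the held-out day, exposing day-to-day shifts in the feature space.
    accuracies = []
    for test_day in range(len(days)):
        train = [d for i, d in enumerate(days) if i != test_day]
        X_train = np.vstack([X for X, _ in train])
        y_train = np.concatenate([y for _, y in train])
        X_test, y_test = days[test_day]

        clf = make_pipeline(StandardScaler(), SVC(kernel='linear'))
        clf.fit(X_train, y_train)
        accuracies.append(clf.score(X_test, y_test))

    print('Per-day accuracy:', np.round(accuracies, 3))
    print('Mean cross-day accuracy:', np.mean(accuracies).round(3))

Comparing the per-day accuracies from this scheme against conventional within-day cross-validation would expose the performance drop attributable to inter-day variability.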

Keywords

EEG-based emotion classification · Day-to-day variability

Acknowledgement

This work was supported in part by the Army Research Laboratory under Cooperative Agreement Number W911NF-10-2-0022.

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, USA