An Affective BCI Using Multiple ERP Components Associated to Facial Emotion Processing

  • Qibin Zhao
  • Yu Zhang
  • Akinari Onishi
  • Andrzej Cichocki
Chapter
Part of the SpringerBriefs in Electrical and Computer Engineering book series (BRIEFSELECTRIC)

Abstract

P300-based brain-computer interfaces (BCIs) have successfully demonstrated that attention to an oddball stimulus can enhance the P300 component of the event-related potential (ERP) phase-locked to the event. However, it was unclear whether more sophisticated face-evoked potentials can also be modulated by related mental tasks under the oddball paradigm. This study investigated ERP responses to image stimuli of objects, neutral faces, and emotional faces while subjects performed attention, face recognition, and discrimination of emotional facial expressions, respectively, under the oddball paradigm. The results revealed significant differences between target and non-target ERPs for each mental task. In addition, significant differences among the three mental tasks were observed for the vertex-positive potential (VPP) over fronto-central sites, the late positive potential (LPP)/P3b over centro-parietal sites, and the N250 over occipito-temporal sites. These findings indicate that a novel affective BCI paradigm can be developed based on the detection of multiple ERP components reflecting human face encoding and emotion processing. The high classification performance achieved for single-trial emotional face-related ERPs demonstrated the effectiveness of the proposed affective BCI.
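
As a rough illustration of the kind of single-trial ERP classification the abstract refers to, the sketch below epochs continuous EEG around stimulus onsets and trains a shrinkage LDA to separate target from non-target trials. It is a minimal, generic example under stated assumptions (hypothetical data shapes, an assumed 250 Hz sampling rate, and a 0-800 ms window chosen to cover VPP/N250/LPP latencies), not the authors' actual pipeline or parameters.

    # A minimal, generic sketch of single-trial target vs. non-target ERP
    # classification under an oddball paradigm; not the authors' pipeline.
    # All data below are hypothetical placeholders.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    FS = 250                # assumed sampling rate (Hz)
    WIN = int(0.8 * FS)     # 0-800 ms post-stimulus window (covers VPP, N250, LPP/P3b)

    def epoch(eeg, onsets, win=WIN):
        """Cut a (channels x win) epoch after each onset and flatten it to a feature vector."""
        return np.stack([eeg[:, o:o + win].ravel() for o in onsets])

    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((16, 60 * FS))                  # 16 channels, 60 s of simulated EEG
    onsets = np.arange(FS, 55 * FS, FS)                       # one stimulus per second
    labels = (np.arange(len(onsets)) % 6 == 0).astype(int)    # roughly 1-in-6 oddball targets

    X = epoch(eeg, onsets)
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")  # shrinkage helps with few trials
    print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())

In an actual study, channel selection, decimation, and regularization would follow the paper's own settings, which are not given in this abstract.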

Copyright information

© The Author(s) 2013

Authors and Affiliations

  • Qibin Zhao (1)
  • Yu Zhang (1)
  • Akinari Onishi (1)
  • Andrzej Cichocki (1)

  1. Laboratory for Advanced Brain Signal Processing, Brain Science Institute, RIKEN, Saitama, Japan