Brain Functional Connectivity Analysis and Crucial Channel Selection Using Channel-Wise CNN

  • Jiaxing Wang
  • Weiqun Wang (Email author)
  • Zeng-Guang Hou
  • Xu Liang
  • Shixin Ren
  • Liang Peng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11304)


Brain functional connectivity analysis and crucial channel selection play an important role in exploring how the brain works and in EEG-based emotion recognition. Towards this purpose, a novel channel-wise convolutional neural network (CWCNN) is proposed, in which each group convolution operates on a single channel only. The inputs and weights of the fully connected layer are visualized as brain topographic maps in order to analyze brain functional connectivity and select the crucial channels. Experiments are carried out on the SJTU emotion EEG database (SEED). The results demonstrate that positive and neutral emotions evoke greater brain activity than negative emotions in the left frontal region, which is consistent with the power spectrum analyses reported in the literature. Meanwhile, 16 crucial channels, mainly distributed in the frontal and temporal regions, are selected by the proposed method to improve emotion recognition performance. The classification accuracy obtained with the 16 selected channels is similar to that obtained without channel selection, while the resulting model is more memory-efficient and its computation time can be reduced substantially.
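The abstract does not give the network's exact layer sizes, so the following is only a minimal sketch of the channel-wise idea in PyTorch (ref. 14): a grouped 1-D convolution with `groups` equal to the number of EEG channels confines each filter to a single channel, and the magnitudes of the fully connected layer's weights then yield a per-channel importance score for channel selection. The class name `ChannelWiseCNN`, the kernel size, the pooling step, and the 62-channel/3-class configuration (SEED's montage and its positive/neutral/negative labels) are assumptions, not the authors' reported architecture.

```python
import torch
import torch.nn as nn

class ChannelWiseCNN(nn.Module):
    """Sketch of a channel-wise CNN: one conv filter per EEG channel."""

    def __init__(self, num_channels=62, num_classes=3):
        super().__init__()
        # groups=num_channels -> depthwise conv: each filter sees only
        # its own EEG channel, so channels are never mixed before the
        # fully connected layer.
        self.conv = nn.Conv1d(num_channels, num_channels,
                              kernel_size=5, groups=num_channels)
        self.pool = nn.AdaptiveAvgPool1d(1)       # one value per channel
        self.fc = nn.Linear(num_channels, num_classes)

    def forward(self, x):                         # x: (batch, channels, time)
        h = torch.relu(self.conv(x))
        h = self.pool(h).squeeze(-1)              # (batch, channels)
        return self.fc(h)

model = ChannelWiseCNN()
out = model(torch.randn(4, 62, 200))              # 4 trials, 200 time samples

# Per-channel importance: absolute fc weights summed over classes;
# the largest entries would nominate the "crucial" channels.
importance = model.fc.weight.abs().sum(dim=0)
```

Because each fully connected weight connects exactly one channel's pooled activation to an output class, ranking channels by `importance` is a direct (if simplified) analogue of the weight-visualization-based selection the paper describes.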


Keywords: Channel-wise convolution neural network · Brain topographic maps · Full connection layer · Weights · Crucial channels


References

  1. Guo, S., Zhao, X., Wei, W., Guo, J., Zhao, F., Hu, Y.: Feasibility study of a novel rehabilitation training system for upper limb based on emotional control. In: 2015 IEEE International Conference on Mechatronics and Automation (ICMA), pp. 1507–1512 (2015)
  2. Sourina, O., Liu, Y., Nguyen, M.K.: Real-time EEG-based emotion recognition for music therapy. J. Multimodal User Interfaces 5(1–2), 27–35 (2012)
  3. Datko, M., Pineda, J.A., Müller, R.A.: Positive effects of neurofeedback on autism symptoms correlate with brain activation during imitation and observation. Eur. J. Neurosci. 47(6), 579–591 (2017)
  4. Hartwig, M., Bond, C.F.: Lie detection from multiple cues: a meta-analysis. Appl. Cogn. Psychol. 28(5), 661–676 (2014)
  5. Jiang, M., Rahmani, A.M., Westerlund, T., Liljeberg, P., Tenhunen, H.: Facial expression recognition with sEMG method. In: 2015 IEEE International Conference on Computer and Information Technology; Ubiquitous Computing and Communications; Dependable, Autonomic and Secure Computing; Pervasive Intelligence and Computing, pp. 981–988 (2015)
  6. Renjith, S., Manju, K.G.: Speech based emotion recognition in Tamil and Telugu using LPCC and Hurst parameters – a comparative study using KNN and ANN classifiers. In: 2017 International Conference on Circuit, Power and Computing Technologies (ICCPCT), pp. 1–6 (2017)
  7. Wen, W., Liu, G., Cheng, N., Wei, J., Shangguan, P., Huang, W.: Emotion recognition based on multi-variant correlation of physiological signals. IEEE Trans. Affect. Comput. 5(2), 126–140 (2014)
  8. Li, H., Qing, C., Xu, X., Zhang, T.: A novel DE-PCCM feature for EEG-based emotion recognition. In: 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), pp. 389–393 (2017)
  9. Matiko, J.W., Beeby, S.P., Tudor, J.: Fuzzy logic based emotion classification. In: 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4389–4393 (2014)
  10. Leslie, G., Ojeda, A., Makeig, S.: Towards an affective brain-computer interface monitoring musical engagement. In: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, pp. 871–875 (2013)
  11. Tandle, A., Jog, N., Dharmadhikari, A., Jaiswal, S.: Estimation of valence of emotion from musically stimulated EEG using frontal theta asymmetry. In: 2016 12th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD), pp. 63–68 (2016)
  12. Zheng, W.L., Zhu, J.Y., Peng, Y., Lu, B.L.: EEG-based emotion classification using deep belief networks. In: 2014 IEEE International Conference on Multimedia and Expo (ICME), pp. 1–6 (2014)
  13. Zheng, W.L., Lu, B.L.: Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans. Auton. Ment. Dev. 7(3), 162–175 (2015)
  14. PyTorch deep learning framework. Accessed 22 Aug 2018

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Jiaxing Wang (1, 2)
  • Weiqun Wang (2) (Email author)
  • Zeng-Guang Hou (1, 2, 3)
  • Xu Liang (1, 2)
  • Shixin Ren (1, 2)
  • Liang Peng (2)

  1. University of Chinese Academy of Sciences, Beijing, China
  2. The State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
  3. CAS Center for Excellence in Brain Science and Intelligence Technology, Beijing, China