An Ensemble Neural Network for Multi-label Classification of Electrocardiogram

  • Dongya Jia
  • Wei Zhao
  • Zhenqi Li
  • Cong Yan
  • Hongmei Wang
  • Jing Hu
  • Jiansheng Fang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11794)


An electrocardiogram (ECG) record may contain multiple abnormalities concurrently, so multi-label classification of ECG is clinically significant. In this paper, we propose an ensemble neural network to address the multi-label classification of 12-lead ECG. The proposed network contains two modules, which treat the multi-label task from two different perspectives. The first module handles the task in a sequence-generation manner via a novel encoder-decoder structure. The second module treats the multi-label problem as multiple binary classification tasks, employing two convolutional neural networks of different structures. Finally, the predictions of the two modules are integrated as the final result. Our method is trained and evaluated on the dataset provided by the First China ECG Intelligent Competition, and yields a Macro-\(F_1\) of 0.872 on the test set.
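The abstract does not specify how the two modules' predictions are integrated, so the following is only an illustrative sketch: it assumes each module emits per-label probabilities, averages them, and thresholds at 0.5. The label names, the averaging rule, and the threshold are all assumptions, not the authors' exact method.

```python
import numpy as np

# Illustrative label set; the competition's actual label set is not listed here.
LABELS = ["AF", "PVC", "Normal"]

def ensemble_predict(seq_gen_probs, binary_probs, threshold=0.5):
    """Combine the sequence-generation module's and the binary-classification
    module's per-label probabilities by simple averaging, then threshold.
    Both inputs are sequences of per-label probabilities aligned with LABELS."""
    avg = (np.asarray(seq_gen_probs, dtype=float)
           + np.asarray(binary_probs, dtype=float)) / 2.0
    return [label for label, p in zip(LABELS, avg) if p >= threshold]
```

For example, if the sequence-generation module is confident about "AF" and the binary module additionally favors "PVC", only labels whose averaged probability clears the threshold are emitted.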


Keywords: Deep learning · ECG · Multi-label classification
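The reported Macro-\(F_1\) is the unweighted mean of per-label \(F_1\) scores, so every abnormality class counts equally regardless of its frequency. A minimal sketch of the metric (the binary indicator-matrix encoding of labels is an assumption about the evaluation format):

```python
import numpy as np

def macro_f1(y_true, y_pred):
    """Macro-F1: unweighted mean of per-label F1 scores.
    y_true, y_pred: binary indicator arrays of shape (n_samples, n_labels)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    f1s = []
    for k in range(y_true.shape[1]):
        tp = np.sum((y_true[:, k] == 1) & (y_pred[:, k] == 1))
        fp = np.sum((y_true[:, k] == 0) & (y_pred[:, k] == 1))
        fn = np.sum((y_true[:, k] == 1) & (y_pred[:, k] == 0))
        denom = 2 * tp + fp + fn
        # Convention: a label with no true or predicted positives scores 0.
        f1s.append(2 * tp / denom if denom else 0.0)
    return float(np.mean(f1s))
```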



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Dongya Jia¹
  • Wei Zhao¹
  • Zhenqi Li¹
  • Cong Yan¹
  • Hongmei Wang¹
  • Jing Hu¹
  • Jiansheng Fang¹

  1. Central Research, Guangzhou Shiyuan Electronic Technology Company Limited, Guangzhou, China
