Neuromorphic Neural Network for Multimodal Brain Image Segmentation and Overall Survival Analysis

  • Woo-Sup Han
  • Il Song Han
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11384)


Image analysis of brain tumors is a key element of clinical decision making, yet manual segmentation is time consuming and known to vary between clinicians and radiologists. In this paper, we apply a neuromorphic convolutional neural network to this multimodal image segmentation task, using a down-up resizing network structure. A controlled rectifier neuron function is incorporated into the neuromorphic neural network to carry over the efficient segmentation and saliency-map generation previously demonstrated on noisy X-ray CT data and dark road video data. The proposed neuromorphic neural network for brain image analysis builds on a visual cortex-inspired deep neural network originally developed for three-dimensional tooth segmentation and robust visual object detection. Experimental results illustrate the effectiveness and feasibility of the proposed method under the flexible data requirements of clinical diagnostic decisions, from segmentation to overall survival analysis. Survival prediction reached 71% accuracy on data with ground-truth outcomes, and 50.6% accuracy in predicting survival days on the individual challenge data without any clinical diagnostic data.
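The abstract's two core ingredients can be sketched in code: a down-up resizing pass (downsample then upsample back to the input resolution, as in encoder-decoder segmentation networks) and a controlled rectifier neuron function. The paper does not give the exact formulation of either, so the snippet below is a minimal illustrative sketch under assumed forms: `controlled_rectifier` is a hypothetical piecewise-linear unit with threshold, gain, and ceiling parameters, and `down_up_pass` uses average pooling with nearest-neighbour upsampling.

```python
import numpy as np

def controlled_rectifier(x, threshold=0.1, gain=1.0, ceiling=1.0):
    """Hypothetical controlled rectifier (assumed form, not the paper's):
    zero below `threshold`, linear with slope `gain` above it,
    saturating at `ceiling` to suppress noise-driven extremes."""
    y = gain * (x - threshold)
    return np.clip(np.where(x > threshold, y, 0.0), 0.0, ceiling)

def down_up_pass(img, factor=2):
    """Sketch of one down-up resizing stage: average-pool the image
    by `factor`, then nearest-neighbour upsample back to the original
    size. A real segmentation network would interleave convolutions
    and activations around these resizing steps."""
    h, w = img.shape
    pooled = img.reshape(h // factor, factor,
                         w // factor, factor).mean(axis=(1, 3))
    return np.repeat(np.repeat(pooled, factor, axis=0), factor, axis=1)

# Example: activate then down-up a toy 4x4 "image".
img = np.arange(16.0).reshape(4, 4) / 16.0
out = down_up_pass(controlled_rectifier(img))
```

The activation's thresholding discards weak (likely noisy) responses while the ceiling bounds strong ones, which is consistent with the abstract's emphasis on noisy-image processing; the pooling/upsampling pair preserves the input resolution so per-pixel segmentation labels can be produced.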


Keywords: Convolutional neural network · Neuromorphic processing · Brain tumor · Image segmentation · Survival analysis · Visual cortex



The authors appreciate the reviewers' comments and constructive feedback, which helped improve this article.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. ODIGA Ltd, London, UK
