Breast Cancer Diagnostic Tool Using Deep Feedforward Neural Network and Mother Tree Optimization

  • Wael Korani
  • Malek Mouhoub
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1173)

Abstract

Automatic diagnostic tools have been widely adopted in the medical diagnosis of many diseases. Breast cancer diagnosis is particularly important, as breast cancer has become one of the most dangerous diseases for women, and regular, preemptive screening can help initiate treatment earlier and more effectively. Hospitals and clinics therefore need a robust diagnostic tool that provides reliable results, and accuracy is a key factor to consider when designing such a system. This has motivated us to develop an automatic diagnostic system combining two methodologies: Deep Feedforward Neural Networks (DFNNs) and swarm intelligence algorithms. The swarm intelligence techniques considered are Particle Swarm Optimization (PSO) and the Mother Tree Optimization (MTO) algorithm we proposed in earlier work. To assess the accuracy of the proposed system, we conducted several experiments on the Wisconsin Breast Cancer Dataset (WBCD). The results show that the DFNN combined with a variant of our MTO attains high classification performance, reaching 100% precision.
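
The following is a minimal sketch of the general scheme the abstract describes: a small feedforward classifier for the breast cancer task whose weights are tuned by a swarm optimizer instead of back-propagation. It is not the authors' implementation; the one-hidden-layer architecture, the PSO hyper-parameters, and the use of scikit-learn's built-in Wisconsin (diagnostic) breast cancer data as a stand-in for the WBCD are assumptions, and standard global-best PSO is used in place of MTO, whose update rules are not given here.

# Minimal sketch: weights of a small feedforward classifier optimized by
# global-best PSO (Eberhart & Kennedy) rather than back-propagation.
# Architecture, hyper-parameters, and the scikit-learn WDBC data are
# illustrative assumptions, not the setup used in the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

n_in, n_hid = X.shape[1], 8                      # assumed: one hidden layer, 8 units
n_w = n_in * n_hid + n_hid + n_hid + 1           # W1, b1, w2, b2 flattened

def forward(w, X):
    """Map a flat weight vector and inputs to class-1 probabilities."""
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    w2, b2 = w[n_in * n_hid + n_hid:-1], w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

def fitness(w):
    """Training cross-entropy; this is the objective the swarm minimizes."""
    p = np.clip(forward(w, X_tr), 1e-9, 1 - 1e-9)
    return -np.mean(y_tr * np.log(p) + (1 - y_tr) * np.log(1 - p))

n_particles, n_iters = 40, 300
inertia, c1, c2 = 0.7, 1.5, 1.5                  # assumed PSO hyper-parameters
pos = rng.normal(0.0, 0.5, (n_particles, n_w))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([fitness(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

acc = np.mean((forward(gbest, X_te) > 0.5) == y_te)
print(f"held-out accuracy: {acc:.3f}")

Swapping in MTO (or any other swarm method) would only change the velocity and position update inside the loop; the fitness function, i.e. the network's training loss, stays the same, which is the sense in which the paper couples a DFNN with a swarm optimizer.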

Keywords

Neural network · Nature-inspired techniques · Classification · Breast cancer diagnosis

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. University of Saskatchewan, Saskatoon, Canada
  2. University of Regina, Regina, Canada