
Evolving Systems, Volume 10, Issue 3, pp 381–396

Dendrite ellipsoidal neurons based on k-means optimization

  • Fernando Arce
  • Erik Zamora
  • Carolina Fócil-Arias
  • Humberto Sossa
Original Paper

Abstract

Dendrite morphological neurons are a type of artificial neural network that can be used to solve classification problems. The major difference with respect to classical perceptrons is that morphological neurons separate patterns from different classes with hyperboxes, whereas perceptrons use hyperplanes. In this paper, we introduce an improved version of dendrite morphological neural networks, which we call the dendrite ellipsoidal neuron (DEN), that employs hyperellipsoids instead of hyperboxes. The ellipsoidal neuron is presented with a new training algorithm that sets the covariance matrix and centroid of each hyperellipsoid using k-means++ and applies hill climbing to search for an optimal number of hyperellipsoids. The main advantage of this approach is that the dendrite ellipsoidal neuron creates smoother decision boundaries. The proposed neural model was tested on synthetic and real datasets from the UCI machine learning repository and compared with other classifiers using a paired t-test: it achieved an average accuracy of 80.7%, while multi-layer perceptrons gave 78.4%, support vector machines 74.2%, and radial basis networks 72.7%. Lastly, to assess the performance of the proposed method on real practical problems, our model was used to detect lane lines on an urban highway, to classify figures with a Nao robot, and for traffic detection.
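The training idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: it fixes the number of hyperellipsoids per class (the paper instead searches for it by hill climbing), uses scikit-learn's k-means++ initialization for the cluster centroids, and classifies a sample by the class of its nearest ellipsoid under the Mahalanobis distance induced by each cluster's covariance matrix. The function names `fit_den` and `predict_den` are illustrative, not from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_den(X, y, n_clusters=2, seed=0):
    """Fit one set of hyperellipsoids (class, centroid, inverse covariance)
    per class via k-means++ clustering of that class's samples."""
    ellipsoids = []
    for c in np.unique(y):
        Xc = X[y == c]
        km = KMeans(n_clusters=n_clusters, init="k-means++", n_init=10,
                    random_state=seed).fit(Xc)
        for j in range(n_clusters):
            pts = Xc[km.labels_ == j]
            mu = pts.mean(axis=0)
            # Small ridge keeps the covariance invertible for tiny clusters.
            cov = np.cov(pts, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            ellipsoids.append((c, mu, np.linalg.inv(cov)))
    return ellipsoids

def predict_den(X, ellipsoids):
    """Assign each sample the class of its Mahalanobis-nearest ellipsoid."""
    preds = []
    for x in X:
        # Squared Mahalanobis distance (x - mu)^T Sigma^{-1} (x - mu).
        dists = [((x - mu) @ icov @ (x - mu), c) for c, mu, icov in ellipsoids]
        preds.append(min(dists)[1])
    return np.array(preds)
```

In the paper the number of hyperellipsoids is selected automatically; wrapping `fit_den` in a hill-climbing loop over `n_clusters`, scored by validation accuracy, would approximate that step.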

Keywords

Morphological neurons · Dendrite morphological neural networks · k-means++ · Ellipsoidal neuron

Notes

Acknowledgements

E. Zamora and H. Sossa would like to acknowledge the support provided by UPIITA-IPN and CIC-IPN in carrying out this research. This work was economically supported by SIP-IPN (grant numbers 20170836 and 20170693) and CONACYT grant number 65 (Frontiers of Science). F. Arce and C. Fócil-Arias acknowledge CONACYT for the scholarships granted towards pursuing their PhD studies. All the authors thank the student exchange program Delfín and the exchange students for implementing some DEN applications.

References

  1. Arce F, Zamora E, Sossa H, Barrón R (2018) Differential evolution training algorithm for dendrite morphological neural networks. Appl Soft Comput 68:303–313. https://doi.org/10.1016/j.asoc.2018.03.033
  2. Arce F, Zamora E, Barrón R, Sossa H (2016) Dendrite morphological neurons trained by differential evolution. In: 2016 IEEE symposium series on computational intelligence
  3. Arce F, Zamora E, Sossa H (2017) Dendrite ellipsoidal neuron. In: 2017 international joint conference on neural networks (IJCNN), pp 795–802. https://doi.org/10.1109/IJCNN.2017.7965933
  4. Dheeru D, Karra Taniskidou E (2017) UCI machine learning repository. University of California, Irvine, School of Information and Computer Sciences. http://archive.ics.uci.edu/ml
  5. Babiloni F, Bianchi L, Semeraro F, Millán J, Mouriño J, Cattini A, Salinari S, Marciani M, Cincotti F (2001) Mahalanobis distance-based classifiers are able to recognize EEG patterns by using few EEG electrodes. Neural Netw 1:651–654
  6. Barmpoutis A, Ritter GX (2006) Orthonormal basis lattice neural networks. In: 2006 IEEE international conference on fuzzy systems, pp 331–336. https://doi.org/10.1109/FUZZY.2006.1681733
  7. Bishop CM (2006) Pattern recognition and machine learning (information science and statistics). Springer, New York
  8. Burnham KP, Anderson DR (eds) (2002) Information and likelihood theory: a basis for model selection and inference. Springer, New York, pp 49–97. https://doi.org/10.1007/978-0-387-22456-5_2
  9. Cerioli A (2005) K-means cluster analysis and Mahalanobis metrics: a problematic match or an overlooked opportunity? Stat Appl 17(1):61–73
  10. Davidson JL, Hummer F (1993) Morphology neural networks: an introduction with applications. Circ Syst Signal Process 12(2):177–210. https://doi.org/10.1007/BF01189873
  11. Davidson JL, Ritter GX (1990) Theory of morphological neural networks. In: Proc SPIE, digital optical computing. https://doi.org/10.1117/12.18085
  12. Davidson JL, Sun K (1991) Template learning in morphological neural nets. In: Proc SPIE. https://doi.org/10.1117/12.46114
  13. de Araujo RA (2012) A morphological perceptron with gradient-based learning for Brazilian stock market forecasting. Neural Netw 28:61–81. https://doi.org/10.1016/j.neunet.2011.12.004
  14. Fischler MA, Bolles RC (1981) Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun ACM 24(6):381–395. https://doi.org/10.1145/358669.358692
  15. Goodfellow I, Bengio Y, Courville A (2016) Deep learning. MIT Press, Cambridge. http://www.deeplearningbook.org. Accessed 20 Aug 2017
  16. Hernández G, Zamora E, Sossa H (2017) Comparing deep and dendrite neural networks: a case study. Springer, Cham, pp 32–41. https://doi.org/10.1007/978-3-319-59226-8_4
  17. Leshno M, Lin VY, Pinkus A, Schocken S (1993) Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Netw 6(6):861–867
  18. Mahalanobis PC (1936) On the generalised distance in statistics. Proc Natl Inst Sci India 2(1):49–55
  19. McLachlan GJ, Basford KE (1988) Mixture models: inference and applications to clustering. Marcel Dekker, New York
  20. Melnykov I, Melnykov V (2014) On K-means algorithm with the use of Mahalanobis distances. Stat Probab Lett 84:88–95. https://doi.org/10.1016/j.spl.2013.09.026
  21. Penrose R (1955) A generalized inverse for matrices. Math Proc Camb Philos Soc 51(3):406–413. https://doi.org/10.1017/S0305004100030401
  22. Pessoa LF, Maragos P (2000) Neural networks with hybrid morphological/rank/linear nodes: a unifying framework with applications to handwritten character recognition. Pattern Recognit 33(6):945–960. https://doi.org/10.1016/S0031-3203(99)00157-0
  23. Reynolds DA (2015) Gaussian mixture models. In: Encyclopedia of biometrics, 2nd edn. Springer, pp 827–832. https://doi.org/10.1007/978-1-4899-7488-4_196
  24. Ritter GX, Iancu L, Urcid G (2003) Morphological perceptrons with dendritic structure. In: The 12th IEEE international conference on fuzzy systems (FUZZ '03), vol 2, pp 1296–1301. https://doi.org/10.1109/FUZZ.2003.1206618
  25. Ritter GX, Schmalz MS (2006) Learning in lattice neural networks that employ dendritic computing. In: 2006 IEEE international conference on fuzzy systems, pp 7–13. https://doi.org/10.1109/FUZZY.2006.1681687
  26. Ritter GX, Sussner P (1996) An introduction to morphological neural networks. In: Proceedings of the 13th international conference on pattern recognition, vol 4, pp 709–717. https://doi.org/10.1109/ICPR.1996.547657
  27. Ritter GX, Urcid G, Juan-Carlos VN (2014) Two lattice metrics dendritic computing for pattern recognition. In: 2014 IEEE international conference on fuzzy systems (FUZZ-IEEE), pp 45–52. https://doi.org/10.1109/FUZZ-IEEE.2014.6891551
  28. Ritter GX, Urcid G (2003) Lattice algebra approach to single-neuron computation. IEEE Trans Neural Netw 14(2):282–295. https://doi.org/10.1109/TNN.2003.809427
  29. Ritter GX, Urcid G (2007) Learning in lattice neural networks that employ dendritic computing. Springer, Berlin, pp 25–44. https://doi.org/10.1007/978-3-540-72687-6_2
  30. Ritter GX, Li D, Wilson JN (1989) Image algebra and its relationship to neural networks. In: Proc SPIE. https://doi.org/10.1117/12.960428
  31. Sossa H, Guevara E (2013) Modified dendrite morphological neural network applied to 3D object recognition. LNCS 7914. Springer, Berlin, pp 314–324. https://doi.org/10.1007/978-3-642-38989-4_32
  32. Sossa H, Guevara E (2014) Efficient training for dendrite morphological neural networks. Neurocomputing 131:132–142. https://doi.org/10.1016/j.neucom.2013.10.031
  33. Sossa H, Cortés G, Guevara E (2014) New radial basis function neural network architecture for pattern classification: first results. Springer, Cham, pp 706–713. https://doi.org/10.1007/978-3-319-12568-8_86
  34. Sung KK, Poggio T (1995) Learning human face detection in cluttered scenes. Springer, Heidelberg, pp 432–439. https://doi.org/10.1007/3-540-60268-2_326
  35. Sung KK, Poggio T (1998) Example-based learning for view-based human face detection. IEEE Trans Pattern Anal Mach Intell 20(1):39–51. https://doi.org/10.1109/34.655648
  36. Sussner P (1998) Morphological perceptron learning. In: Proceedings of the 1998 IEEE international symposium on intelligent control (ISIC), held jointly with CIRA and ISAS, pp 477–482. https://doi.org/10.1109/ISIC.1998.713708
  37. Sussner P, Esmi EL (2009a) Constructive morphological neural networks: some theoretical aspects and experimental results in classification. Springer, Berlin, pp 123–144. https://doi.org/10.1007/978-3-642-04512-7_7
  38. Sussner P, Esmi EL (2011) Morphological perceptrons with competitive learning: lattice-theoretical framework and constructive learning algorithm. Inf Sci 181(10):1929–1950. https://doi.org/10.1016/j.ins.2010.03.016 (special issue on Information Engineering Applications Based on Lattices)
  39. Sussner P, Esmi EL (2009b) An introduction to morphological perceptrons with competitive learning. In: 2009 international joint conference on neural networks, pp 3024–3031. https://doi.org/10.1109/IJCNN.2009.5178860
  40. Telgarsky M (2016) Benefits of depth in neural networks. In: Feldman V, Rakhlin A, Shamir O (eds) 29th annual conference on learning theory, proceedings of machine learning research, vol 49. PMLR, Columbia University, New York, pp 1517–1539. http://proceedings.mlr.press/v49/telgarsky16.html. Accessed 1 Sept 2017
  41. Viola P, Jones M (2001) Rapid object detection using a boosted cascade of simple features. In: Proceedings of the 2001 IEEE computer society conference on computer vision and pattern recognition (CVPR 2001), vol 1, pp I-511–I-518. https://doi.org/10.1109/CVPR.2001.990517
  42. Weinberger KQ, Blitzer J, Saul LK (2006) Distance metric learning for large margin nearest neighbor classification. In: NIPS. MIT Press, Cambridge
  43. Zamora E, Sossa H (2017) Dendrite morphological neurons trained by stochastic gradient descent. Neurocomputing 260:420–431. https://doi.org/10.1016/j.neucom.2017.04.044
  44. Zamora E, Sossa H (2016) Dendrite morphological neurons trained by stochastic gradient descent. In: 2016 IEEE symposium series on computational intelligence
  45. Zhang S, Pan X (2011) A novel text classification based on Mahalanobis distance. In: 2011 3rd international conference on computer research and development (ICCRD), vol 3, pp 156–158. https://doi.org/10.1109/ICCRD.2011.5764268

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  • Fernando Arce (1)
  • Erik Zamora (2)
  • Carolina Fócil-Arias (1)
  • Humberto Sossa (1, 3)
  1. Instituto Politécnico Nacional, CIC, Mexico City, Mexico
  2. Instituto Politécnico Nacional, UPIITA, Mexico City, Mexico
  3. Tecnológico de Monterrey, Campus Guadalajara, Zapopan, Mexico
