Resistant Neural Network Learning via Resistant Empirical Risk Minimization

  • Zaur M. Shibzukhov
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11554)


The article proposes an extended version of the empirical risk minimization principle for training neural networks that is stable with respect to a large number of outliers in the training data. It is based on replacing the arithmetic mean in the empirical risk estimate with resistant averaging aggregation functions. An iteratively re-weighted scheme is proposed for minimizing differentiable resistant estimates of the mean loss. This scheme allows weighted versions of traditional back-propagation algorithms to be used for training neural networks in the presence of a large number of outliers.
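The paper's resistant averaging estimates are not reproduced here, but the general iteratively re-weighted idea the abstract refers to can be sketched on a simple robust line fit. This is a minimal illustration, not the paper's algorithm: the Huber-style weight function and the MAD-based scale estimate are standard robust-statistics choices assumed for the example.

```python
import random
import statistics

def irls_line_fit(xs, ys, c=1.345, n_iter=50):
    """Fit y ~ a*x + b by iteratively re-weighted least squares.

    Huber-style weights shrink the influence of points with large
    residuals, so a moderate fraction of outliers barely moves the fit.
    """
    n = len(xs)
    w = [1.0] * n
    a, b = 0.0, 0.0
    for _ in range(n_iter):
        # Weighted least-squares step: solve the 2x2 normal equations.
        sw = sum(w)
        sx = sum(wi * xi for wi, xi in zip(w, xs))
        sy = sum(wi * yi for wi, yi in zip(w, ys))
        sxx = sum(wi * xi * xi for wi, xi in zip(w, xs))
        sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, xs, ys))
        det = sw * sxx - sx * sx
        a = (sw * sxy - sx * sy) / det
        b = (sxx * sy - sx * sxy) / det
        # Re-weight from residuals using a robust (MAD-based) scale.
        r = [yi - (a * xi + b) for xi, yi in zip(xs, ys)]
        s = statistics.median(abs(ri) for ri in r) / 0.6745 + 1e-12
        # Huber weight: 1 for small residuals, decaying as c*s/|r| beyond.
        w = [min(1.0, c * s / max(abs(ri), 1e-12)) for ri in r]
    return a, b

# Line y = 2x + 1 with small noise; every 5th point grossly corrupted.
random.seed(0)
xs = [i / 39 for i in range(40)]
ys = [2 * x + 1 + random.gauss(0, 0.01) for x in xs]
for i in range(0, 40, 5):
    ys[i] += 8.0
a, b = irls_line_fit(xs, ys)
```

Despite 20% of the points being shifted by +8, the recovered slope and intercept stay close to the true values, because the re-weighting step drives the outliers' weights toward zero. In a neural-network setting, the same per-sample weights would multiply the loss terms inside a weighted back-propagation pass instead of a closed-form least-squares solve.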


Keywords: Neural networks · Robust estimation · Resistant averaging function



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Institute of Mathematics and Informatics, Moscow Pedagogical State University, Moscow, Russia
  2. Institute of Applied Mathematics and Automation KBSC RAS, Nalchik, Russia
