Multivariate Quantitative Approximation by Perturbed Kantorovich–Shilkret Neural Network Operators

  • George A. Anastassiou
Part of the Studies in Systems, Decision and Control book series (SSDC, volume 190)


This chapter determines the rate of convergence to the unit operator of perturbed Kantorovich–Shilkret multivariate normalized neural network operators of one hidden layer. The rates are expressed via the multivariate modulus of continuity of the function under approximation, or of its higher-order partial derivatives, and appear in the associated multivariate Jackson-type inequalities. The activation function is very general and can derive from any multivariate sigmoid or bell-shaped function. The right-hand sides of our Jackson-type inequalities do not depend on the activation function. The sample functionals are of Kantorovich–Shilkret type. We provide an application involving the first partial derivatives of the approximated function. This chapter follows [4].
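The Kantorovich–Shilkret sample functionals rest on Shilkret's maxitive integral [6], defined for a nonnegative function f and a maxitive measure μ by N∫ f dμ = sup_{t ≥ 0} [t · μ({f ≥ t})]. As a rough illustration of this definition only (the function name and the finite, weight-generated maxitive measure below are illustrative choices, not taken from the chapter), a minimal discrete sketch:

```python
def shilkret_integral(values, weights):
    """Discrete Shilkret (maxitive) integral of a nonnegative function.

    values[i]  = f(x_i) >= 0 on a finite ground set {x_1, ..., x_n}
    weights[i] generate the maxitive measure mu(A) = max_{i in A} weights[i]

    Computes sup_{t >= 0} t * mu({f >= t}); on a finite set the supremum
    is attained at one of the function values, so only those thresholds
    need to be checked.
    """
    best = 0.0
    for t in values:
        # mu of the level set {x_i : f(x_i) >= t} under the maxitive measure
        mu_level = max(w for v, w in zip(values, weights) if v >= t)
        best = max(best, t * mu_level)
    return best
```

For example, with f = (1, 2, 3) and unit weights (so μ ≡ 1 on nonempty sets) the integral equals the maximum value 3, reflecting the "maxitive" character of the integral as opposed to the additive Lebesgue one.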


References

  1. G.A. Anastassiou, Quantitative Approximations (Chapman and Hall/CRC, Boca Raton, 2001)
  2. G.A. Anastassiou, Intelligent Systems: Approximation by Artificial Neural Networks (Springer, Heidelberg, 2011)
  3. G.A. Anastassiou, Intelligent Systems II: Complete Approximation by Neural Network Operators (Springer, Heidelberg, 2016)
  4. G.A. Anastassiou, Multivariate approximation with rates by perturbed Kantorovich–Shilkret neural network operators (2018). Submitted
  5. P. Cardaliaguet, G. Euvrard, Approximation of a function and its derivative with a neural network. Neural Netw. 5, 207–220 (1992)
  6. N. Shilkret, Maxitive measure and integration. Indag. Math. 33, 109–116 (1971)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Mathematical Sciences, University of Memphis, Memphis, USA
