
High Generalization Capability Artificial Neural Network Architecture Based on RBF-Network

  • Mikhail Abrosimov
  • Alexander Brovko
Conference paper
Part of the Studies in Systems, Decision and Control book series (SSDC, volume 199)

Abstract

This paper addresses the problem of error-level fluctuations that arise in RBF-networks when the training set shrinks. An artificial neural network (ANN) architecture based on the RBF-network is presented, together with a learning algorithm for training it. Unlike the original RBF-network, the presented architecture is multi-layer and therefore has potential for deep learning. Numerical results show that, when the training set shrinks, the error-level fluctuations of the presented architecture are significantly lower than those of the RBF-network, which indicates its greater generalization ability. The paper also includes an application of the ANN to the task of reconstructing the dielectric parameters of an object placed in a waveguide.
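
The paper itself provides no code; the following is a minimal NumPy sketch of how a multi-layer (stacked) RBF architecture of the kind described in the abstract might be organized. The class names (RBFLayer, StackedRBFNetwork), the Gaussian basis functions, the random center selection, the fixed width sigma, and the least-squares readout are all assumptions for illustration and do not reproduce the authors' learning algorithm or center-selection criteria.

```python
import numpy as np


class RBFLayer:
    """One radial-basis layer: Gaussian activations around fixed centers.

    Center selection here is a hypothetical random subsampling of the
    inputs; the paper's own center-selection criteria differ.
    """

    def __init__(self, n_centers, sigma, rng=None):
        self.n_centers = n_centers
        self.sigma = sigma
        self.rng = rng or np.random.default_rng(0)
        self.centers = None

    def fit(self, X):
        # Pick centers as a random subset of the layer's input vectors.
        idx = self.rng.choice(len(X), size=min(self.n_centers, len(X)),
                              replace=False)
        self.centers = X[idx]
        return self

    def transform(self, X):
        # Gaussian RBF activations: exp(-||x - c||^2 / (2 * sigma^2)).
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))


class StackedRBFNetwork:
    """Several RBF layers stacked, with a linear least-squares readout."""

    def __init__(self, layer_sizes, sigma=1.0):
        self.layers = [RBFLayer(n, sigma) for n in layer_sizes]
        self.weights = None

    def fit(self, X, y):
        # Propagate the training inputs through each layer in turn,
        # fitting that layer's centers on the propagated representation.
        H = X
        for layer in self.layers:
            H = layer.fit(H).transform(H)
        # Linear output weights (with a bias column) by least squares.
        H_aug = np.hstack([H, np.ones((len(H), 1))])
        self.weights, *_ = np.linalg.lstsq(H_aug, y, rcond=None)
        return self

    def predict(self, X):
        H = X
        for layer in self.layers:
            H = layer.transform(H)
        H_aug = np.hstack([H, np.ones((len(H), 1))])
        return H_aug @ self.weights
```

A hypothetical usage, with X_train and X_test as 2-D feature matrices and y_train as the target values, would be `StackedRBFNetwork([50, 20], sigma=1.0).fit(X_train, y_train).predict(X_test)`.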

Keywords

Artificial neural network · Neural network learning algorithm · RBF neural network

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Yuri Gagarin State Technical University of Saratov, Saratov, Russia
