Modeling Parallel Optimization of the Early Stopping Method of Multilayer Perceptron

  • Maciej Krawczak
  • Sotir Sotirov
  • Evdokia Sotirova
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 657)

Abstract

Very often, overfitting of a multilayer perceptron varies significantly across different regions of the model. Excess capacity allows a better fit in regions of high nonlinearity, while backpropagation often avoids overfitting the regions of low nonlinearity. The generalized net used here makes parallel optimization of the MLP possible, based on the early stopping algorithm.
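The chapter formulates this with a generalized net; that formulation is not reproduced here. As a point of reference, below is a minimal sketch of conventional validation-based early stopping for a one-hidden-layer MLP in plain NumPy. The toy data, network size, learning rate, and patience value are illustrative assumptions, not taken from the chapter.

    import numpy as np

    # Minimal early-stopping sketch for a one-hidden-layer MLP.
    # All hyperparameters and the toy data are illustrative assumptions.
    rng = np.random.default_rng(0)

    # Toy regression data, split into training and validation sets.
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.sin(3.0 * X) + 0.1 * rng.normal(size=X.shape)
    X_train, y_train, X_val, y_val = X[:150], y[:150], X[150:], y[150:]

    hidden = 10
    W1 = rng.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)

    def forward(X):
        h = np.tanh(X @ W1 + b1)          # hidden-layer activations
        return h, h @ W2 + b2             # activations and network output

    lr, patience = 0.05, 20
    best_val, best_params, wait = np.inf, None, 0

    for epoch in range(5000):
        # One batch gradient-descent step (backpropagation).
        h, out = forward(X_train)
        err = out - y_train                        # dMSE/d_out, up to a constant
        dW2 = h.T @ err / len(X_train); db2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h**2)           # tanh derivative
        dW1 = X_train.T @ dh / len(X_train); db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

        # Early stopping: monitor validation error and stop once it has
        # not improved for `patience` consecutive epochs.
        val_err = float(np.mean((forward(X_val)[1] - y_val) ** 2))
        if val_err < best_val:
            best_val, wait = val_err, 0
            best_params = (W1.copy(), b1.copy(), W2.copy(), b2.copy())
        else:
            wait += 1
            if wait >= patience:
                break

    W1, b1, W2, b2 = best_params  # restore weights from the best epoch
    print(f"stopped after epoch {epoch}, best validation MSE {best_val:.4f}")

The patience counter is the standard stopping criterion: training halts once the validation error has failed to improve for a fixed number of consecutive epochs, and the weights from the best epoch are restored.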

Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Maciej Krawczak (1)
  • Sotir Sotirov (2)
  • Evdokia Sotirova (2)

  1. Higher School of Applied Informatics and Management, Warsaw, Poland
  2. Asen Zlatarov University, Burgas, Bulgaria
