Batch-Sequential Algorithm for Neural Networks Trained with Entropic Criteria

Conference paper
Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005 (ICANN 2005)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3697)

Abstract

Using entropy as a cost function in the neural network learning phase usually implies that, in the back-propagation algorithm, training is done in batch mode. Besides the higher computational complexity of the batch-mode algorithm, this approach is known to have some limitations compared with the sequential mode. In this paper we present a way of combining both modes when using entropic criteria. We report experiments that validate the proposed method and compare it with the pure batch-mode algorithm.
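Since the abstract is terse, a sketch may help make the idea concrete. In the error-entropy-minimization (EEM) setting of Erdogmus and Príncipe, the cost is Rényi's quadratic entropy of the training errors, estimated with a Gaussian Parzen window. Because that estimate is a sum over pairs of errors, a true per-sample (sequential) update is not defined, which is why entropic training normally runs in batch mode; a batch-sequential compromise shuffles the training set, splits it into sub-batches, and updates the weights after each one. The NumPy sketch below illustrates that scheme under those assumptions only; the network size, toy data, and hyper-parameters (sigma, lr, n_splits) are placeholders, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss(d, sigma):
    # Gaussian kernel for the Parzen-window estimate of the error density.
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def ip_grad(e, sigma):
    # Gradient of the quadratic information potential
    #   V(e) = (1/M^2) * sum_i sum_j G(e_i - e_j; sigma)
    # with respect to each error e_i. Maximizing V is equivalent to
    # minimizing Renyi's quadratic entropy H2(e) = -log V(e).
    d = e[:, None] - e[None, :]                 # M x M pairwise differences
    return -2.0 * (d * gauss(d, sigma)).sum(axis=1) / (e.size ** 2 * sigma ** 2)

# Tiny one-hidden-layer MLP; sizes are illustrative, not the paper's setup.
n_in, n_hid = 1, 6
W1 = rng.normal(0.0, 0.5, (n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (1, n_hid));    b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1.T + b1)                  # hidden activations
    return h, (h @ W2.T + b2).ravel()           # hidden layer and output

# Toy regression data, purely for demonstration.
X = rng.uniform(-1.0, 1.0, (200, 1))
t = np.sin(3.0 * X).ravel() + 0.05 * rng.normal(size=200)

sigma, lr, n_splits = 0.5, 0.5, 4               # assumed hyper-parameters
for _ in range(400):                            # training epochs
    # Batch-sequential pass: shuffle, split into sub-batches, and update
    # the weights after each sub-batch rather than once per epoch.
    for batch in np.array_split(rng.permutation(len(X)), n_splits):
        Xb, tb = X[batch], t[batch]
        h, y = forward(Xb)
        e = tb - y                              # errors of this sub-batch only
        dy = -ip_grad(e, sigma)                 # dV/dy_i, since de_i/dy_i = -1
        dz = (dy[:, None] * W2) * (1.0 - h ** 2)    # back-prop through tanh
        # Gradient ascent on V, i.e. descent on the error entropy H2.
        W2 += lr * (dy[None, :] @ h); b2 += lr * dy.sum(keepdims=True)
        W1 += lr * (dz.T @ Xb);       b1 += lr * dz.sum(axis=0)

# H2 is invariant to a constant shift of the errors, so the output bias
# is re-centred after training to make the mean error zero.
b2 += (t - forward(X)[1]).mean()
```

One payoff of the split is visible in ip_grad: the kernel matrix is quadratic in the sub-batch size, so an epoch over M samples taken in n_splits sub-batches costs roughly M²/n_splits kernel evaluations instead of M², which is presumably part of the complexity advantage the abstract alludes to.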

This work was supported by the Portuguese Fundação para a Ciência e Tecnologia (project POSI/EIA/56918/2004).

An erratum to this chapter can be found at http://dx.doi.org/10.1007/11550907_163 .

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Santos, J.M., de Sá, J.M., Alexandre, L.A. (2005). Batch-Sequential Algorithm for Neural Networks Trained with Entropic Criteria. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds) Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005. ICANN 2005. Lecture Notes in Computer Science, vol 3697. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11550907_15

  • DOI: https://doi.org/10.1007/11550907_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28755-1

  • Online ISBN: 978-3-540-28756-8

  • eBook Packages: Computer Science, Computer Science (R0)
