
Exact Hessian Matrix Calculation for Complex-Valued Neural Networks

  • Conference paper
Soft Computing Applications (SOFA 2014)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 356)


Abstract

In this paper, we present the complete derivation of a method for evaluating the Hessian matrix of a complex-valued feedforward neural network. The Hessian matrix consists of the second-order derivatives of the network's error function and has many applications in network training and pruning algorithms, as well as in fast retraining of the network after a small change in the training data. The software implementation of the presented method is straightforward.
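The paper's exact analytic derivation is not reproduced in this preview. Purely as a hypothetical illustration of the object the abstract describes, the sketch below numerically approximates the Hessian of the real-valued error of a toy one-weight complex network via central finite differences; the toy setup (input `x`, target `t`, the names `error` and `hessian_fd`) is an assumption for illustration, not the paper's method.

```python
import numpy as np

# Toy complex-valued "network": a single complex weight w applied to a
# complex input x, with squared-error loss against a complex target t.
# (Hypothetical example; the paper derives an exact analytic Hessian
# for full feedforward networks, which this sketch does not reproduce.)
x, t = 1.0 + 2.0j, 0.5 - 1.0j

def error(params):
    # params = [Re(w), Im(w)]; the error is real-valued, so second
    # derivatives are taken with respect to the real parameters.
    w = params[0] + 1j * params[1]
    return abs(w * x - t) ** 2

def hessian_fd(f, p, h=1e-5):
    # Central-difference approximation of the Hessian of f at p.
    n = len(p)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            pp = np.array(p, dtype=float)
            pp[i] += h; pp[j] += h; fpp = f(pp)   # f(p + h e_i + h e_j)
            pp[j] -= 2 * h;         fpm = f(pp)   # f(p + h e_i - h e_j)
            pp[i] -= 2 * h;         fmm = f(pp)   # f(p - h e_i - h e_j)
            pp[j] += 2 * h;         fmp = f(pp)   # f(p - h e_i + h e_j)
            H[i, j] = (fpp - fpm - fmp + fmm) / (4 * h * h)
    return H

H = hessian_fd(error, [0.3, -0.7])
```

Because the toy error is quadratic in the real parameters, the Hessian here is the constant matrix 2|x|² I; an exact analytic method such as the one the paper derives avoids the O(n²) function evaluations this brute-force check requires.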



Author information


Corresponding author

Correspondence to Călin-Adrian Popa.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Popa, CA. (2016). Exact Hessian Matrix Calculation for Complex-Valued Neural Networks. In: Balas, V., Jain, L.C., Kovačević, B. (eds) Soft Computing Applications. SOFA 2014. Advances in Intelligent Systems and Computing, vol 356. Springer, Cham. https://doi.org/10.1007/978-3-319-18296-4_36


  • DOI: https://doi.org/10.1007/978-3-319-18296-4_36

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-18295-7

  • Online ISBN: 978-3-319-18296-4

  • eBook Packages: Engineering (R0)
