
Singular Value Decomposition and Neural Networks

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning (ICANN 2019)

Abstract

Singular Value Decomposition (SVD) constitutes a bridge between linear algebra concepts and multi-layer neural networks: it is their linear analogue. Beyond this insight, it can be used as a good initial guess for the network parameters, leading to substantially better optimization results.
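The idea of using an SVD as an initial guess for the network parameters can be illustrated with a minimal sketch. This is only a plausible illustration under assumed shapes and synthetic data, not the procedure reported in the paper: fit the best least-squares linear map from inputs to targets, take its SVD, and split the truncated factorization into two weight matrices of a one-hidden-layer network before nonlinear training. All names (X, Y, W1, W2, k) are hypothetical.

```python
import numpy as np

# NOTE: illustrative sketch only; the exact initialization scheme is an
# assumption, not the authors' reported procedure.

rng = np.random.default_rng(0)
n, d_in, d_out, k = 1000, 20, 5, 10  # samples, input dim, output dim, hidden width

# Synthetic regression data: Y is a noisy linear function of X.
X = rng.standard_normal((n, d_in))
Y = X @ rng.standard_normal((d_in, d_out)) + 0.1 * rng.standard_normal((n, d_out))

# Best linear map M in the least-squares sense, Y ~ X @ M.
M, *_ = np.linalg.lstsq(X, Y, rcond=None)

# SVD of that linear map: M = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Truncate to the hidden width and split the singular values evenly between
# the two layers, so that W1 @ W2 equals the rank-r truncation of M.
r = min(k, s.size)
sqrt_s = np.sqrt(s[:r])
W1 = U[:, :r] * sqrt_s            # (d_in, r): input -> hidden weights
W2 = sqrt_s[:, None] * Vt[:r, :]  # (r, d_out): hidden -> output weights

# Relative reconstruction error of the two-layer linear "network".
print(np.linalg.norm(Y - X @ W1 @ W2) / np.linalg.norm(Y))
```

In a Keras setting, W1 and W2 (with zero biases) could then be assigned as the initial weights of two Dense layers of matching shapes before standard gradient-based fine-tuning of the nonlinear model.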


Notes

  1. Since we use TensorFlow as Keras' backend execution engine, the resulting computation graph would have been split into two separate executions for each optimization step, which would cause too high a computational overhead.


Author information


Corresponding author

Correspondence to Bernhard Bermeitinger.



Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Bermeitinger, B., Hrycej, T., Handschuh, S. (2019). Singular Value Decomposition and Neural Networks. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning. ICANN 2019. Lecture Notes in Computer Science, vol. 11728. Springer, Cham. https://doi.org/10.1007/978-3-030-30484-3_13

  • DOI: https://doi.org/10.1007/978-3-030-30484-3_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30483-6

  • Online ISBN: 978-3-030-30484-3

  • eBook Packages: Computer Science, Computer Science (R0)
