
Using quadratic perceptrons to reduce interconnection density in multilayer neural networks

  • Neural Network Theories, Neural Models
  • Conference paper
  • In: Artificial Neural Networks (IWANN 1991)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 540)

Abstract

Multilayer Perceptron Nets are among the best known architectures for Artificial Neural Networks. The high density of interconnections among neurons, however, makes their VLSI realization extremely difficult. In this paper we introduce quadratic perceptrons and show that they may lead to a substantial reduction in the number of required interconnections, thus improving their suitability for integration. A Multilayer Perceptron Net with quadratic perceptrons in the first layer may be trained using backpropagation.
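
The abstract does not reproduce the paper's exact definition, but a quadratic perceptron is commonly taken to be a second-order neuron whose pre-activation adds quadratic input terms to the usual weighted sum. The sketch below is a minimal illustration in Python/NumPy, assuming the diagonal form f(w·x + v·(x*x) + b); the function and variable names are hypothetical, not from the paper. Each such unit carries 2n weights instead of n, but its quadric decision surface can stand in for several linear hidden units, which is the kind of interconnection saving the abstract describes. Since the unit stays differentiable, ordinary backpropagation applies unchanged.

    import numpy as np

    def quadratic_perceptron(x, w, v, b):
        # Second-order neuron (assumed form): each input enters both
        # linearly and squared, so a single unit can realize a quadric
        # (e.g. ellipsoidal) decision surface rather than a hyperplane.
        s = w @ x + v @ (x * x) + b          # pre-activation
        return 1.0 / (1.0 + np.exp(-s))      # sigmoid keeps it backprop-friendly

    rng = np.random.default_rng(0)
    n_in, n_hidden = 4, 3

    W = rng.normal(size=(n_hidden, n_in))    # linear weights, first layer
    V = rng.normal(size=(n_hidden, n_in))    # weights on the squared inputs
    b = np.zeros(n_hidden)
    w_out = rng.normal(size=n_hidden)        # ordinary (linear) output perceptron

    x = rng.normal(size=n_in)
    h = np.array([quadratic_perceptron(x, W[j], V[j], b[j]) for j in range(n_hidden)])
    y = 1.0 / (1.0 + np.exp(-(w_out @ h)))   # network output
    print(h, y)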

Editor information

Alberto Prieto

Copyright information

© 1991 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Röckmann, D., Moraga, C. (1991). Using quadratic perceptrons to reduce interconnection density in multilayer neural networks. In: Prieto, A. (eds) Artificial Neural Networks. IWANN 1991. Lecture Notes in Computer Science, vol 540. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0035881

  • DOI: https://doi.org/10.1007/BFb0035881

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-54537-8

  • Online ISBN: 978-3-540-38460-1

  • eBook Packages: Springer Book Archive
