Abstract

Interest in and applications of artificial neural networks (ANNs) have been increasing in recent years. Applications include pattern matching, associative memory, image processing, and word recognition (Simpson 1992). ANNs are a novel computing paradigm in which an artificial neuron produces an output that depends on its inputs (from other neurons), the weights associated with those inputs, and an activation function.
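The neuron model described above (a weighted sum of inputs passed through an activation function) can be sketched in software as follows; this is an illustrative example only, using a logistic sigmoid activation, and does not reflect the chapter's hardware implementation:

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    """Compute one artificial neuron's output: a weighted sum of the
    inputs plus a bias, passed through a logistic sigmoid activation."""
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-net))  # sigmoid maps net input to (0, 1)

# Example: two inputs, two weights
y = neuron_output([1.0, 0.0], [0.5, 0.5])
```

Hardware emulators such as the one described in this chapter compute the same weighted-sum-and-activation operation, but with pipelined fixed-point arithmetic rather than floating-point software.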


References

  • Blayo, F. and Hurat, P., "A VLSI Systolic Array Dedicated to Hopfield Neural Networks," in VLSI for Artificial Intelligence, J. Delgado-Frias and W. Moore (Eds.), pp. 255–264, Kluwer Academic Publishers, 1989.

  • Duranton, M., Gobert, J. and Mauduit, N., "A Digital VLSI Module for Neural Networks," in Neural Networks from Models to Applications, L. Personnaz and G. Dreyfus (Eds.), Paris: I.D.S.E.T., 1989.

  • Gnanasekaran, R., "On a Bit-Serial Input and Bit-Serial Output Multiplier," IEEE Transactions on Computers, Vol. C-32, no. 9, pp. 878–880, 1983.

  • Hopfield, J., "Neurons with Graded Response Have Collective Computational Properties Like Those of Two-State Neurons," Proceedings of the National Academy of Sciences, pp. 3088–3092, May 1984.

  • Kung, S. Y. and Hwang, J. N., "A Unified Systolic Architecture for Artificial Neural Networks," Journal of Parallel and Distributed Computing, Vol. 6, pp. 358–387, 1989.

  • Pechanek, G. G., Vassiliadis, S. and Delgado-Frias, J. G., "Digital Neural Emulators Using Tree Accumulation and Communication Structures," IEEE Transactions on Neural Networks, Vol. 3, no. 6, pp. 934–950, November 1992.

  • Quach, N. T. and Flynn, M. J., "High-Speed Addition in CMOS," IEEE Transactions on Computers, Vol. 41, no. 12, pp. 1612–1615, December 1992.

  • Rumelhart, D. E., McClelland, J. L. and the PDP Research Group, Parallel Distributed Processing, Vol. 1: Foundations. Cambridge, Mass.: The MIT Press, 1986.

  • Simpson, P. K., "Foundations of Neural Networks," in Artificial Neural Networks: Paradigms, Applications and Hardware Implementations, E. Sánchez-Sinencio and C. Lau (Eds.), New York: IEEE Press, pp. 3–24, 1992.

  • Treleaven, P. and Vellasco, M., "Neural Networks on Silicon," in Wafer Scale Integration, III, M. Sami and F. Distante (Eds.), pp. 1–10, Elsevier Science Publishers, 1990.

  • Vassiliadis, S., "Recursive Equations for Hardwired Binary Adders," International Journal of Electronics, Vol. 67, no. 2, pp. 201–213, 1989.

  • Vassiliadis, S., Pechanek, G. G. and Delgado-Frias, J. G., "SPIN: A Sequential Pipelined Neurocomputer," IEEE International Conference on Tools for Artificial Intelligence, pp. 74–81, San Jose, Calif., November 1991.

  • Weinfeld, M., "A Fully Digital Integrated CMOS Hopfield Network Including the Learning Algorithm," in VLSI for Artificial Intelligence, J. Delgado-Frias and W. Moore (Eds.), pp. 169–178, Kluwer Academic Publishers, 1989.

  • Widrow, B. and Lehr, M. A., "30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation," Proceedings of the IEEE, Vol. 78, no. 9, pp. 1415–1442, 1990.

  • Zhang, M., Delgado-Frias, J. G., Vassiliadis, S. and Pechanek, G. G., "Hardwired Sigmoid Generator," IBM Technical Report TR01.C492, pp. 1–38, IBM, Endicott, NY, September 1992.


Copyright information

© 1994 Springer Science+Business Media New York

About this chapter

Cite this chapter

Delgado-Frias, J.G., Vassiliadis, S., Pechanek, G.G., Lin, W., Barber, S.M., Ding, H. (1994). A VLSI Pipelined Neuroemulator. In: Delgado-Frias, J.G., Moore, W.R. (eds) VLSI for Neural Networks and Artificial Intelligence. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-1331-9_7

  • DOI: https://doi.org/10.1007/978-1-4899-1331-9_7

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4899-1333-3

  • Online ISBN: 978-1-4899-1331-9

  • eBook Packages: Springer Book Archive
