Abstract
Interest in and applications of artificial neural networks (ANNs) have been increasing in recent years. Applications include pattern matching, associative memory, image processing, and word recognition (Simpson 1992). ANNs constitute a novel computing paradigm in which an artificial neuron produces an output that depends on its inputs (from other neurons), the strengths or weights associated with those inputs, and an activation function.
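As a minimal sketch of this neuron model, the snippet below computes one neuron's output as an activation function applied to the weighted sum of its inputs. The function name, example weights, and the choice of a sigmoid activation are illustrative assumptions, not taken from the chapter itself (though the sigmoid is the activation used in Hopfield's graded-response model cited below).

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    """Output of a single artificial neuron: an activation function
    applied to the weighted sum of the inputs plus a bias.
    (Illustrative sketch; names and values are hypothetical.)"""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Sigmoid activation squashes the sum into (0, 1)
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# Example: a neuron receiving three inputs from other neurons
out = neuron_output([0.5, -1.0, 0.25], [0.8, 0.3, -0.5])
```

A hardware emulator such as the one described in this chapter evaluates many such weighted sums and activations in parallel or in a pipeline, which is why multiplier, adder, and sigmoid-generator designs (Gnanasekaran 1983; Vassiliadis 1989; Zhang et al. 1992) figure prominently in the references.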
References
Blayo, F. and Hurat, P., “A VLSI Systolic Array Dedicated to Hopfield Neural Networks.” VLSI for Artificial Intelligence, J. Delgado-Frias and W. Moore (Eds), pp. 255–264, Kluwer Academic Publishers, 1989.
Duranton, M., Gobert, J. and Mauduit, N., “A Digital VLSI Module for Neural Networks,” Neural Networks from Models to Applications, L. Personnaz and G. Dreyfus (Eds), Paris: I.D.S.E.T., 1989.
Gnanasekaran, R., “On a Bit-Serial Input and Bit-Serial Output Multiplier,” IEEE Transactions on Computers, Vol. C-32, no. 9, pp. 878–880, 1983.
Hopfield, J., “Neurons with Graded Response Have Collective Computational Properties Like Those of Two-State Neurons,” Proceedings of the National Academy of Sciences, pp. 3088–3092, May 1984.
Kung, S. Y. and Hwang, J. N., “A Unified Systolic Architecture for Artificial Neural Networks,” Journal of Parallel and Distributed Computing, Vol. 6, pp. 358–387, 1989.
Pechanek, G.G., Vassiliadis, S., and Delgado-Frias, J. G., “Digital Neural Emulators Using Tree Accumulation and Communication Structures,” IEEE Transactions on Neural Networks, Vol. 3, no. 6, pp. 934–950, November 1992.
Quach, N. T. and Flynn, M. J., “High-Speed Addition in CMOS,” IEEE Transactions on Computers, Vol. 41, no. 12, pp. 1612–1615, December 1992.
Rumelhart, D. E., McClelland, J. L. and the PDP Research Group, Parallel Distributed Processing, Vol. 1: Foundations. Cambridge, Mass.: The MIT Press, 1986.
Simpson, P. K., “Foundations of Neural Networks,” in Artificial Neural Networks: Paradigms, Applications and Hardware Implementations. E. Sánchez-Sinencio and C. Lau (Eds), New York: IEEE Press, pp. 3–24, 1992.
Treleaven, P. and Vellasco, M. “Neural Networks on Silicon,” Wafer Scale Integration, III, M. Sami and F. Distante (Eds), pp. 1–10, Elsevier Science Publishers, 1990.
Vassiliadis, S., “Recursive Equations for Hardwired Binary Adders,” Int. Journal of Electronics, Vol. 67, no. 2, pp. 201–213, 1989.
Vassiliadis, S., Pechanek, G. G. and Delgado-Frias, J. G., “SPIN: A Sequential Pipelined Neurocomputer,” IEEE Int. Conference on Tools for Artificial Intelligence, pp. 74–81, San Jose, Calif., November 1991.
Weinfeld, M., “A Fully Digital Integrated CMOS Hopfield Network Including the Learning Algorithm,” VLSI for Artificial Intelligence, J. Delgado-Frias and W. Moore (Eds), pp. 169–178, Kluwer Academic Publishers, 1989.
Widrow, B. and Lehr, M. A., “30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation,” Proceedings of the IEEE, Vol. 78, no. 9, pp. 1415–1442, 1990.
Zhang, M., Delgado-Frias, J. G., Vassiliadis, S. and Pechanek, G. G., “Hardwired Sigmoid Generator,” IBM Technical Report TR01.C492, pp. 1–38, IBM, Endicott, NY, September 1992.
Copyright information
© 1994 Springer Science+Business Media New York
Cite this chapter
Delgado-Frias, J.G., Vassiliadis, S., Pechanek, G.G., Lin, W., Barber, S.M., Ding, H. (1994). A VLSI Pipelined Neuroemulator. In: Delgado-Frias, J.G., Moore, W.R. (eds) VLSI for Neural Networks and Artificial Intelligence. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-1331-9_7
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4899-1333-3
Online ISBN: 978-1-4899-1331-9