Encyclopedia of Database Systems

2018 Edition
| Editors: Ling Liu, M. Tamer Özsu

Neural Networks

  • Pang-Ning Tan
Reference work entry
DOI: https://doi.org/10.1007/978-1-4614-8265-9_560


Synonyms

Connectionist model; Parallel distributed processing


Definition

An artificial neural network (ANN) is an abstract computational model designed to solve a variety of supervised and unsupervised learning tasks. While the discussion in this entry focuses only on supervised classification, readers interested in unsupervised learning with ANNs may refer to the literature on vector quantization [1] and self-organizing maps [2]. An ANN consists of an assembly of simple processing units called neurons, connected by a set of weighted edges (or synapses), as shown in Fig. 1. The neurons are often configured into a feed-forward multilayered topology, with outputs from one layer being fed into the next layer. The first layer, known as the input layer, encodes the attributes of the input data, while the last layer, known as the output layer, encodes the neural network’s output. Hidden layers are the intermediary layers of neurons between the input and output layers. A...
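The feed-forward topology described above can be sketched in a few lines of NumPy. The following is a minimal, illustrative forward pass through a 2-3-1 network (one hidden layer); the weight values and the logistic activation are arbitrary choices made for the example, not part of any specific method in this entry.

```python
import numpy as np

def sigmoid(x):
    """Logistic activation: squashes each neuron's weighted input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, layers):
    """Propagate input x through successive layers given as (weights, bias) pairs."""
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)  # each layer's output feeds the next layer
    return a

# A tiny 2-3-1 network: 2 input attributes, 3 hidden neurons, 1 output neuron.
# Weights are random here purely for illustration.
rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(3, 2)), np.zeros(3)),  # input layer -> hidden layer
    (rng.normal(size=(1, 3)), np.zeros(1)),  # hidden layer -> output layer
]
y = forward(np.array([0.5, -1.0]), layers)  # network output, a value in (0, 1)
```

For a binary classification task, the single output in (0, 1) would typically be thresholded at 0.5 to produce a class label; training such a network (e.g., by backpropagation [10]) adjusts the edge weights rather than the topology.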

Recommended Reading

  1. Haykin S. Neural networks – a comprehensive foundation. 2nd ed. Englewood: Prentice-Hall; 1998.
  2. Kohonen T. Self-organizing maps. Berlin: Springer; 2001.
  3. Rosenblatt F. Principles of neurodynamics. New York: Spartan Books; 1959.
  4. McCulloch WS, Pitts W. A logical calculus of the ideas immanent in nervous activity. Bull Math Biophys. 1943;5(4):115–33.
  5. Hebb D. The organization of behaviour. New York: Wiley; 1949.
  6. Widrow B, Hoff ME Jr. Adaptive switching circuits. IRE WESCON convention record; 1960. p. 96–104.
  7. Amari S. A theory of adaptive pattern classifiers. IEEE Trans Electron Comput. 1967;16(3):299–307.
  8. Minsky M, Papert S. Perceptrons: an introduction to computational geometry. Cambridge, MA: MIT; 1969.
  9. Ackley DH, Hinton GE, Sejnowski TJ. A learning algorithm for Boltzmann machines. Cogn Sci. 1985;9(1):147–69.
  10. Rumelhart DE, Hinton GE, Williams RJ. Learning representations by back-propagating errors. Nature. 1986;323(6088):533–6.
  11. Powell M. Radial basis functions for multivariable interpolation: a review. In: Mason JC, Cox MG, editors. Algorithms for approximation. New York: Clarendon Press; 1987. p. 143–67.
  12. Hopfield JJ. Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci. 1982;79(8):2554–8.
  13. Jordan MI. Attractor dynamics and parallelism in a connectionist sequential machine. In: Proceedings of the 8th Annual Conference of the Cognitive Science Society; 1986. p. 531–46.
  14. Elman JL. Finding structure in time. Cogn Sci. 1990;14(2):179–211.
  15. Lampinen J, Vehtari A. Bayesian approach for neural networks – review and case studies. Neural Netw. 2001;14(3):7–24.
  16. Bengio Y, Courville A, Vincent P. Representation learning: a review and new perspectives. IEEE Trans Pattern Anal Mach Intell. 2013;35(8):1798–828.
  17. Craven M, Shavlik JW. Learning symbolic rules using artificial neural networks. In: Proceedings of the 10th International Conference on Machine Learning; 1993. p. 73–80.
  18. Fahlman SE, Lebiere C. The cascade-correlation learning architecture. Adv Neural Inf Process Syst. 1989;2:524–32.
  19. Hopfield JJ. Learning algorithms and probability distributions in feed-forward and feed-back networks. Proc Natl Acad Sci U S A. 1987;84(23):8429–33.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Michigan State University, East Lansing, USA

Section editors and affiliations

  • Kyuseok Shim
  1. School of Elec. Eng. and Computer Science, Seoul National Univ., Seoul, Republic of Korea