Abstract
This chapter begins by briefly summarising well-known classical connectionist/artificial neural network models such as multi-layer perceptron neural networks (MLP-NNs), radial basis function neural networks (RBF-NNs), self-organising feature maps (SOFMs), associative memories, and Hopfield-type recurrent neural networks (HRNNs). These models are shown normally to require iterative and/or complex parameter approximation procedures, and it is highlighted why interest in such approaches for modelling psychological functions and for developing artificial intelligence (in a more realistic sense) has generally waned.
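The contrast the abstract draws can be made concrete: unlike the iteratively trained classical models, the probabilistic/generalised regression neural networks (PNNs/GRNNs) the chapter moves towards store training patterns directly and predict in one pass via a kernel-weighted average. The following is a minimal illustrative sketch of a GRNN prediction (Nadaraya-Watson form), not the chapter's own implementation; the function name, the toy data, and the choice of a Gaussian kernel with width `sigma` are assumptions for illustration.

```python
import math

def grnn_predict(train_x, train_y, x, sigma=0.5):
    """GRNN (Nadaraya-Watson) prediction: a Gaussian-kernel-weighted
    average of stored training targets -- no iterative weight training.
    (Illustrative sketch; parameter names and defaults are assumed.)"""
    # One kernel unit per stored training pattern ("pattern layer").
    weights = [
        math.exp(-sum((a - b) ** 2 for a, b in zip(xi, x)) / (2 * sigma ** 2))
        for xi in train_x
    ]
    # Output is the normalised weighted sum of the stored targets.
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Toy training set (hypothetical): targets follow y = x.
train_x = [(0.0,), (1.0,), (2.0,), (3.0,)]
train_y = [0.0, 1.0, 2.0, 3.0]
```

"Training" here is just storing the patterns; the only free parameter is the kernel width, which is why such networks avoid the convergence issues of gradient-based learning at the cost of memory growing with the training set.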
Hoya, T. From Classical Connectionist Models to Probabilistic/Generalised Regression Neural Networks (PNNs/GRNNs). In: Artificial Mind System - Kernel Memory Approach. Studies in Computational Intelligence, vol 1. Springer, Berlin, Heidelberg. https://doi.org/10.1007/10997444_2
Print ISBN: 978-3-540-26072-1
Online ISBN: 978-3-540-32403-4