Abstract
Multilayered feed-forward networks (perceptrons) are special cases of the general McCulloch-Pitts neural network with arbitrarily interconnected neurons. Conversely, any general "recurrent" neural network can be represented by a feed-forward perceptron, albeit one with possibly very many layers. The reason for this strange equivalence is that the temporal evolution (3.5) of an arbitrary network constructed from binary neurons is necessarily periodic. This follows immediately from the observation that the N neurons can assume only 2^N configurations altogether, so some state of the network must reoccur after at most 2^N steps. Since only the present state of the network enters on the right-hand side of the evolution law (13.1), the subsequent evolution proceeds strictly periodically from that moment on. If one regards the state of the neural network at time t = n as the nth layer of a perceptron (with all layers identical!), the temporal-evolution law can be viewed as the law governing the flow of information from one layer to the next. It then suffices to take into account only a finite number of such layers, namely as many as there are time steps leading up to the first repetition of a network configuration.
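The counting argument and the unrolling construction are easy to make concrete. The following is a minimal sketch, not the chapter's own formulation: it assumes a synchronous sign-threshold update for the binary neurons, and the weights W, thresholds theta, and the ±1 state convention are illustrative choices rather than the specific form of (13.1). It iterates an arbitrary recurrent network until a configuration reoccurs, which by the counting argument must happen within 2^N steps.

```python
import numpy as np

# Minimal sketch (not from the chapter): a McCulloch-Pitts network of N
# binary neurons with synchronous updates. W, theta, and the +/-1 state
# convention are illustrative assumptions.

rng = np.random.default_rng(0)
N = 6
W = rng.normal(size=(N, N))    # arbitrary recurrent couplings w_ij
theta = rng.normal(size=N)     # firing thresholds

def step(s):
    # One synchronous update; this same map is the layer-to-layer
    # transition of the unrolled feed-forward perceptron.
    return np.where(W @ s >= theta, 1, -1)

# Iterate from a random initial state, recording every configuration.
# With only 2**N possible states, a repeat must occur within 2**N steps,
# after which the evolution is strictly periodic.
s = rng.choice([-1, 1], size=N)
seen = {}
t = 0
while tuple(s) not in seen:
    seen[tuple(s)] = t
    s = step(s)
    t += 1

t_first = seen[tuple(s)]
print(f"state at t={t_first} reoccurs at t={t}; period = {t - t_first}")
# The first t applications of `step` constitute a feed-forward perceptron
# with t identical layers, all sharing the same W and theta.
```

In this picture the finite number of layers needed is exactly the number of time steps recorded before the first repetition.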
© 1990 Springer-Verlag Berlin Heidelberg
Cite this chapter
Müller, B., Reinhardt, J. (1990). Coupled Neural Networks. In: Neural Networks. Physics of Neural Networks. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-97239-3_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-97241-6
Online ISBN: 978-3-642-97239-3