Abstract
The N-Tuple Neural Network (NTNN) is a fast, efficient memory-based neural network capable of performing non-linear function approximation and pattern classification. The random nature of the N-tuple sampling of the input vectors makes precise analysis difficult. Here, the NTNN is considered within the unifying framework of the General Memory Neural Network (GMNN), a family of networks that includes such important types as radial basis function networks. Discussing the NTNN within this framework gives a clearer understanding of its operation and efficient application. The nature of the intrinsic tuple distances, and the resultant kernel, is also discussed, together with techniques for handling non-binary input patterns. An example of a tuple-based network, a simple extension of the conventional NTNN, is shown to yield the best estimate of the underlying regression function, E(Y|x), for a finite training set. Finally, the pattern classification capabilities of the NTNN are considered.
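To make the memory-based operation described above concrete, the following is a minimal sketch of an N-tuple classifier. All names and structural choices (partitioning the input into random n-bit tuples, using Python sets as the per-tuple memories, scoring by counting tuple matches) are illustrative assumptions, not the chapter's notation or the authors' implementation.

```python
import random

class NTupleClassifier:
    """Illustrative N-tuple classifier sketch (assumed structure).

    Each of the `num_tuples` tuples samples `n` randomly chosen bit
    positions of a binary input vector; the sampled bits form an
    address into a per-tuple, per-class memory (here a Python set).
    """

    def __init__(self, input_len, n=4, num_tuples=8, seed=0):
        rng = random.Random(seed)
        positions = list(range(input_len))
        rng.shuffle(positions)
        # Partition the shuffled input positions into random n-bit tuples.
        self.tuples = [tuple(positions[i:i + n])
                       for i in range(0, num_tuples * n, n)]
        self.memory = {}  # (class_label, tuple_index) -> set of addresses

    def _address(self, x, positions):
        # Concatenate the sampled bits into an integer memory address.
        addr = 0
        for p in positions:
            addr = (addr << 1) | x[p]
        return addr

    def train(self, x, label):
        # Mark each tuple's address as "seen" for this class.
        for i, t in enumerate(self.tuples):
            self.memory.setdefault((label, i), set()).add(self._address(x, t))

    def score(self, x, label):
        # Count the tuples whose address was seen during training.
        return sum(self._address(x, t) in self.memory.get((label, i), set())
                   for i, t in enumerate(self.tuples))

    def classify(self, x, labels):
        return max(labels, key=lambda c: self.score(x, c))
```

Training simply sets memory locations, so learning is single-pass; classification is a sum of lookups, which is the source of the speed the abstract refers to. The regression variant discussed in the chapter replaces the binary "seen" flags with stored output sums and counts.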
© 1997 Springer Science+Business Media New York
Allinson, N.M., Kolcz, A.R. (1997). N-Tuple Neural Networks. In: Ellacott, S.W., Mason, J.C., Anderson, I.J. (eds) Mathematics of Neural Networks. Operations Research/Computer Science Interfaces Series, vol 8. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-6099-9_1
Print ISBN: 978-1-4613-7794-8
Online ISBN: 978-1-4615-6099-9