Abstract
Radial Basis Neural Networks (RBNNs) can approximate any regular function and train faster than comparable neural networks. However, the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center, so the activation function is radially symmetric and all attributes are treated as equally relevant. This limitation can be addressed by changing the metric used in the activation function (i.e., using a non-symmetric metric). The Mahalanobis distance is one such metric: it takes into account the variability of the attributes and their correlations. However, it is computed directly from the variance-covariance matrix of the data and does not consider the accuracy of the learning algorithm. In this paper, we propose a generalized Euclidean metric that follows the structure of the Mahalanobis distance but is evolved by a Genetic Algorithm (GA). The GA searches for the distance matrix that minimizes the error produced by a fixed RBNN. Our approach has been tested on two domains, with positive results in both cases.
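As a rough illustration of the idea described above (a sketch under stated assumptions, not the authors' implementation), the code below evolves a symmetric matrix M for a fixed RBNN with Gaussian units, where each activation uses the generalized distance d_M(x, c) = sqrt((x - c)^T M (x - c)). The GA operators, population settings, and all function names here are illustrative assumptions; in particular, this toy GA does not constrain M to be positive definite, so the squared distance is clipped at zero.

```python
import math
import random

random.seed(0)

def rbnn_predict(x, centers, weights, M):
    """Output of a fixed RBNN whose Gaussian activations use the
    generalized squared distance (x - c)^T M (x - c)."""
    out = 0.0
    n = len(x)
    for c, w in zip(centers, weights):
        diff = [xi - ci for xi, ci in zip(x, c)]
        # quadratic form (x - c)^T M (x - c)
        d2 = sum(diff[i] * M[i][j] * diff[j]
                 for i in range(n) for j in range(n))
        # clip: an evolved M need not be positive definite
        out += w * math.exp(-max(d2, 0.0))
    return out

def mse(M, data, centers, weights):
    """Mean squared error of the fixed RBNN under distance matrix M."""
    return sum((rbnn_predict(x, centers, weights, M) - y) ** 2
               for x, y in data) / len(data)

def ga_search(data, centers, weights, pop_size=20, gens=20):
    """Toy GA: individuals encode a flattened matrix; fitness is the
    RBNN error, with elitism, one-point crossover and Gaussian mutation."""
    dim = len(centers[0])

    def to_matrix(ind):
        A = [ind[i * dim:(i + 1) * dim] for i in range(dim)]
        # symmetrize so the quadratic form is well defined
        return [[(A[i][j] + A[j][i]) / 2.0 for j in range(dim)]
                for i in range(dim)]

    def fitness(ind):
        return mse(to_matrix(ind), data, centers, weights)

    pop = [[random.uniform(0.0, 2.0) for _ in range(dim * dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.3:          # Gaussian mutation
                k = random.randrange(len(child))
                child[k] += random.gauss(0.0, 0.2)
            children.append(child)
        pop = elite + children                 # elitist replacement
    return to_matrix(min(pop, key=fitness))
```

Because the centers and output weights stay fixed while only M evolves, the GA is searching exactly the space the abstract describes: distance matrices ranked by the error of an already-trained network.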
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Valls, J.M., Aler, R., Fernández, O. (2005). Using a Mahalanobis-Like Distance to Train Radial Basis Neural Networks. In: Cabestany, J., Prieto, A., Sandoval, F. (eds) Computational Intelligence and Bioinspired Systems. IWANN 2005. Lecture Notes in Computer Science, vol 3512. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494669_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26208-4
Online ISBN: 978-3-540-32106-4