Abstract
To design probabilistic neural networks for pattern recognition, we estimate class-conditional probability distributions in the form of finite mixtures of product components. Since the mixture components correspond to neurons, the properties of the neurons can be specified in terms of the component parameters. The probabilistic features defined by the neuron outputs transform the classification problem without information loss while simultaneously minimizing the Shannon entropy of the feature space. We show that, instead of reducing dimensionality, the decision problem can be simplified by a binary approximation of the probabilistic features. In experiments, the resulting binary features not only improve recognition accuracy but are also nearly independent, in accordance with the minimum-entropy property.
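To make the construction concrete, the following is a minimal sketch (with hypothetical, illustrative parameters, not values from the paper) of the two feature maps the abstract describes: the probabilistic features are the posterior component weights of a finite mixture of product components (here diagonal Gaussians), and the binary features are obtained by thresholding them.

```python
import numpy as np

def component_logpdf(x, means, variances):
    """Log-density of each product component: log N(x | mean_m, diag(var_m))."""
    diff = x[None, :] - means                       # shape (M, D)
    return -0.5 * np.sum(diff ** 2 / variances
                         + np.log(2.0 * np.pi * variances), axis=1)

def probabilistic_features(x, weights, means, variances):
    """Posterior component probabilities p(m | x) -- the 'neuron outputs'."""
    log_p = np.log(weights) + component_logpdf(x, means, variances)
    log_p -= log_p.max()                            # numerical stability
    p = np.exp(log_p)
    return p / p.sum()

def binary_features(x, weights, means, variances, threshold=0.5):
    """Binary approximation of the probabilistic features by thresholding."""
    p = probabilistic_features(x, weights, means, variances)
    return (p > threshold).astype(int)

# Toy two-component mixture in 2-D (illustrative values only).
weights = np.array([0.5, 0.5])
means = np.array([[0.0, 0.0], [10.0, 10.0]])
variances = np.ones((2, 2))

x = np.array([0.2, -0.1])                           # point near component 0
print(probabilistic_features(x, weights, means, variances))
print(binary_features(x, weights, means, variances))
```

In practice the mixture parameters would be estimated per class by EM (as in the maximum-likelihood papers cited below), and the binarized component posteriors would then serve as the input features of the classifier.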
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Grim, J. (2008). Extraction of Binary Features by Probabilistic Neural Networks. In: Kůrková, V., Neruda, R., Koutník, J. (eds) Artificial Neural Networks - ICANN 2008. ICANN 2008. Lecture Notes in Computer Science, vol 5164. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87559-8_6
DOI: https://doi.org/10.1007/978-3-540-87559-8_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-87558-1
Online ISBN: 978-3-540-87559-8
eBook Packages: Computer Science (R0)