Abstract
The aim of this paper is to present a dissimilarity measure strategy by which a new philosophy for pattern classification, namely that of Dissimilarity-Based Classifiers (DBCs), can be efficiently implemented. DBCs, proposed by Duin and his co-authors, are based not on the feature measurements of the individual patterns, but rather on a suitable dissimilarity measure between them. The advantage of DBCs is that, since they do not operate on the class-conditional distributions, their accuracy can exceed the Bayes error bound. The strategy requires, however, that the inter-pattern dissimilarities be measured for all the training samples in such a way that there is no zero distance between objects of different classes. In that case the classes do not overlap, and the lower error bound is therefore zero. Thus, to achieve the desired classification accuracy, a suitable method of measuring dissimilarities is required, one that overcomes the limitations imposed by object variations. In this paper, to optimize DBCs, we propose a newly modified Hausdorff distance measure, which computes the distance directly from the input gray-level image without first extracting a binary edge image from it. Also, instead of computing the Hausdorff distance over the entire image, we advocate the use of a spatially weighted mask, which divides the image into several subregions according to their importance. In face recognition, for instance, important regions include the eyes and mouth, while the rest of the face is considered unimportant; there may also be a background region that contains no facial parts. The experimental results, which, to the best of the authors' knowledge, are the first reported for this scheme, demonstrate that the proposed mechanism can increase the classification accuracy compared with conventional approaches on a well-known face database.
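The idea described in the abstract can be illustrated with a minimal sketch. The following is not the authors' exact formulation, but one plausible reading of it: each pixel of a gray-level image is treated as a point (row, col, intensity), so no binary edge extraction is needed; a directed modified Hausdorff distance averages each pixel's nearest-neighbor distance to the other image, with every contribution scaled by a region-importance weight (zero for background subregions); classification is then a simple 1-NN rule in the resulting dissimilarity space. All function and variable names here are illustrative.

```python
import math

def weighted_modified_hausdorff(img_a, img_b, weights):
    """Directed modified Hausdorff distance between two same-sized
    gray-level images. Each pixel is treated as a point
    (row, col, intensity), so the distance is computed directly from
    the gray levels, with no edge extraction. Every pixel of img_a
    contributes its nearest-neighbor distance to img_b, scaled by the
    spatial weight mask: important subregions (e.g., eyes, mouth) get
    large weights, background subregions weight 0 and are skipped."""
    pts_b = [(r, c, img_b[r][c])
             for r in range(len(img_b)) for c in range(len(img_b[0]))]
    total, wsum = 0.0, 0.0
    for r in range(len(img_a)):
        for c in range(len(img_a[0])):
            w = weights[r][c]
            if w == 0:
                continue  # background region: contains no facial parts
            a = (r, c, img_a[r][c])
            nearest = min(math.dist(a, b) for b in pts_b)
            total += w * nearest
            wsum += w
    return total / wsum if wsum else 0.0

def classify_1nn(test_img, prototypes, labels, weights):
    """1-NN rule in the dissimilarity space: symmetrise the directed
    distance and return the label of the closest prototype."""
    def d(x, y):
        return max(weighted_modified_hausdorff(x, y, weights),
                   weighted_modified_hausdorff(y, x, weights))
    dists = [d(test_img, p) for p in prototypes]
    return labels[dists.index(min(dists))]
```

As a toy usage example, a test image whose gray levels nearly match a dark prototype is assigned that prototype's label rather than a bright one's:

```python
dark, bright = [[0, 0], [0, 0]], [[9, 9], [9, 9]]
mask = [[1, 1], [1, 1]]          # all pixels equally important
test = [[1, 0], [0, 0]]
print(classify_1nn(test, [dark, bright], ["dark", "bright"], mask))
```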
This work was generously supported by the KOSEF, the Korea Science and Engineering Foundation (F01-2006-000-10008-0).
References
Jain, A.K., Duin, R.P.W., Mao, J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. and Machine Intell. PAMI-22(1), 4–37 (2000)
Duin, R.P.W., Ridder, D., Tax, D.M.J.: Experiments with a featureless approach to pattern recognition. Pattern Recognition Letters 18, 1159–1166 (1997)
Duin, R.P.W., Pekalska, E., de Ridder, D.: Relational discriminant analysis. Pattern Recognition Letters 20, 1175–1181 (1999)
Pekalska, E., Duin, R.P.W.: Dissimilarity representations allow for building good classifiers. Pattern Recognition Letters 23, 943–956 (2002)
Pekalska, E.: Dissimilarity representations in pattern recognition. Concepts, theory and applications. Ph.D. thesis, Delft University of Technology, Delft, The Netherlands (2005)
Horikawa, Y.: On properties of nearest neighbor classifiers for high-dimensional patterns in dissimilarity-based classification. IEICE Trans. Information & Systems J88-D-II(4), 813–817 (2005)
Pekalska, E., Duin, R.P.W., Paclik, P.: Prototype selection for dissimilarity-based classifiers. Pattern Recognition 39, 189–208 (2006)
Duin, R.P.W.: Personal communication
Oommen, B.J., Kim, S.-W.: On Optimizing Dissimilarity-Based Classification Using Prototype Reduction Schemes. In: Campilho, A., Kamel, M. (eds.) ICIAR 2006. LNCS, vol. 4141, pp. 15–28. Springer, Heidelberg (2006)
Kim, S.W.: On using a dissimilarity representation method to solve the small sample size problem for face recognition. In: ACIVS 2006, Advanced Concepts for Intelligent Vision Systems, Antwerp, Belgium (September 2006)
Belhumeur, P.N., Hespanha, J.P., Kriegman, D.J.: Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Trans. Pattern Anal. and Machine Intell. PAMI-19(7), 711–720 (1997)
Yu, H., Yang, J.: A direct LDA algorithm for high-dimensional data - with application to face recognition. Pattern Recognition 34, 2067–2070 (2001)
Howland, P., Wang, J., Park, H.: Solving the small sample size problem in face recognition using generalized discriminant analysis. Pattern Recognition 39, 277–287 (2006)
Bezdek, J.C., Kuncheva, L.I.: Nearest prototype classifier designs: An experimental study. International Journal of Intelligent Systems 16(12), 1445–1473 (2001)
Dasarathy, B.V.: Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press, Los Alamitos (1991)
Kim, S.W., Oommen, B.J.: Enhancing prototype reduction schemes with LVQ3-type algorithms. Pattern Recognition 36, 1083–1093 (2003)
Kim, S.W., Oommen, B.J.: Enhancing prototype reduction schemes with recursion: A method applicable for “large” data sets. IEEE Trans. Systems, Man, and Cybernetics - Part B SMC-34(3), 1384–1397 (2004)
Huttenlocher, D.P., Klanderman, G.A., Rucklidge, W.J.: Comparing images using the Hausdorff distance. IEEE Trans. Pattern Anal. and Machine Intell PAMI-15(9), 850–863 (1993)
Guo, B., Lam, K.M., Lin, K.H., Siu, W.C.: Human face recognition based on spatially weighted Hausdorff distance. Pattern Recognition Letters 24, 499–507 (2003)
Zhao, C., Shi, W., Deng, Y.: A new Hausdorff distance for image matching. Pattern Recognition Letters 26, 581–586 (2005)
Wu, B.F., Chen, Y.L., Chiu, C.C.: A discriminant analysis based recursive automatic thresholding approach for image segmentation. IEICE Trans. Inf. & Syst E88-D(7), 1716–1723 (2005)
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Kim, SW. (2006). Optimizing Dissimilarity-Based Classifiers Using a Newly Modified Hausdorff Distance. In: Hoffmann, A., Kang, Bh., Richards, D., Tsumoto, S. (eds) Advances in Knowledge Acquisition and Management. PKAW 2006. Lecture Notes in Computer Science(), vol 4303. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11961239_16
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-68955-3
Online ISBN: 978-3-540-68957-7
eBook Packages: Computer Science (R0)