Optimizing Dissimilarity-Based Classifiers Using a Newly Modified Hausdorff Distance

  • Conference paper
Advances in Knowledge Acquisition and Management (PKAW 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4303)

Abstract

The aim of this paper is to present a dissimilarity measure strategy by which a new philosophy for pattern classification, that of Dissimilarity-Based Classifiers (DBCs), can be efficiently implemented. DBCs, proposed by Duin and his co-authors, are based not on the feature measurements of individual patterns, but rather on a suitable dissimilarity measure between them. The advantage of DBCs is that, since they do not operate on the class-conditional distributions, the accuracy can exceed the Bayes’ error bound. The problem with this strategy, however, is that the inter-pattern dissimilarities must be measured for all the training samples in such a way that no two objects of different classes are at zero distance; the classes then do not overlap, and the lower error bound is zero. Thus, to achieve the desired classification accuracy, a suitable method of measuring dissimilarities is required, one that overcomes the limitations caused by object variations. In this paper, to optimize DBCs, we suggest a newly modified Hausdorff distance measure, which determines the distance directly from the input gray-level image without extracting a binary edge image from it. Also, instead of obtaining the Hausdorff distance on the basis of the entire image, we advocate the use of a spatially weighted mask, which divides the entire image region into several subregions according to their importance. In face recognition, for instance, important regions could include the eyes and mouth, while the rest is considered unimportant; there may also be a background region that contains no facial parts. The experimental results, which, to the best of the authors’ knowledge, are the first reported for this scheme, demonstrate that the proposed mechanism can increase the classification accuracy on a well-known face database when compared with “conventional” approaches.
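
To make the ideas in the abstract concrete, the following is a minimal sketch, not the authors' exact formulation: it computes a spatially weighted, modified-Hausdorff-style distance directly on two gray-level images (each pixel is embedded as a (row, column, scaled gray value) point and weighted nearest-point distances are averaged), builds the dissimilarity matrix on which a DBC operates, and applies a nearest-prototype decision in the resulting dissimilarity space. The point embedding, the gamma scaling factor, the weight-mask values, and the toy data are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np


def pixel_points(img, gamma):
    """Embed every pixel as a 3-D point (row, col, gamma * gray value), so the
    distance is taken directly from the gray-level image and no binary edge
    image has to be extracted."""
    rows, cols = np.indices(img.shape)
    return np.stack([rows.ravel(), cols.ravel(), gamma * img.ravel()], axis=1).astype(float)


def directed_distance(pts_a, pts_b, weights_a):
    """Weighted average, over the points of A, of the distance to the nearest
    point of B (the averaging step of a modified Hausdorff distance)."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=2)  # (Na, Nb)
    return np.average(d.min(axis=1), weights=weights_a)


def weighted_modified_hausdorff(img_a, img_b, weight_mask, gamma=0.1):
    """Symmetric distance between two gray-level images of equal size; the
    mask down-weights unimportant or background subregions."""
    w = weight_mask.ravel().astype(float)
    pa, pb = pixel_points(img_a, gamma), pixel_points(img_b, gamma)
    return max(directed_distance(pa, pb, w), directed_distance(pb, pa, w))


def dissimilarity_matrix(images, prototypes, weight_mask):
    """D[i, j] = distance between image i and prototype j; the rows of D are
    the dissimilarity-space representation on which a DBC is trained."""
    return np.array([[weighted_modified_hausdorff(x, p, weight_mask)
                      for p in prototypes] for x in images])


# Toy usage: 8x8 "faces", a mask emphasising a central (eye/mouth-like) block,
# and a nearest-prototype rule applied in the dissimilarity space.
rng = np.random.default_rng(0)
train = [rng.integers(0, 256, (8, 8)) for _ in range(6)]
labels = np.array([0, 0, 0, 1, 1, 1])
mask = np.ones((8, 8))
mask[2:6, 2:6] = 3.0                                   # important subregion

D_train = dissimilarity_matrix(train, train, mask)     # would be fed to the classifier
D_test = dissimilarity_matrix([rng.integers(0, 256, (8, 8))], train, mask)
print(labels[np.argmin(D_test, axis=1)])               # predicted class of the test image
```

In the paper's face-recognition setting, the weight mask would assign larger weights to subregions such as the eyes and mouth, smaller weights to the remaining facial area, and near-zero weights to the background.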

This work was generously supported by KOSEF, the Korea Science and Engineering Foundation (F01-2006-000-10008-0).

References

  1. Jain, A.K., Duin, R.P.W., Mao, J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. and Machine Intell PAMI-22(1), 4–37 (2000)

  2. Duin, R.P.W., Ridder, D., Tax, D.M.J.: Experiments with a featureless approach to pattern recognition. Pattern Recognition Letters 18, 1159–1166 (1997)

  3. Duin, R.P.W., Pekalska, E., de Ridder, D.: Relational discriminant analysis. Pattern Recognition Letters 20, 1175–1181 (1999)

  4. Pekalska, E., Duin, R.P.W.: Dissimilarity representations allow for building good classifiers. Pattern Recognition Letters 23, 943–956 (2002)

  5. Pekalska, E.: Dissimilarity representations in pattern recognition. Concepts, theory and applications. Ph.D. thesis, Delft University of Technology, Delft, The Netherlands (2005)

  6. Horikawa, Y.: On properties of nearest neighbor classifiers for high-dimensional patterns in dissimilarity-based classification. IEICE Trans. Information & Systems J88-D-II(4), 813–817 (2005)

  7. Pekalska, E., Duin, R.P.W., Paclik, P.: Prototype selection for dissimilarity-based classifiers. Pattern Recognition 39, 189–208 (2006)

  8. Duin, R.P.W.: Personal communication

  9. Oommen, B.J., Kim, S.-W.: On Optimizing Dissimilarity-Based Classification Using Prototype Reduction Schemes. In: Campilho, A., Kamel, M. (eds.) ICIAR 2006. LNCS, vol. 4141, pp. 15–28. Springer, Heidelberg (2006)

  10. Kim, S.W.: On using a dissimilarity representation method to solve the small sample size problem for face recognition. In: ACIVS 2006, Advanced Concepts for Intelligent Vision Systems, Antwerp, Belgium (September 2006)

  11. Belhumeur, P.N., Hespanha, J.P., Kriegman, D.J.: Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Trans. Pattern Anal. and Machine Intell PAMI-19(7), 711–720 (1997)

  12. Yu, H., Yang, J.: A direct LDA algorithm for high-dimensional data - with application to face recognition. Pattern Recognition 34, 2067–2070 (2001)

  13. Howland, P., Wang, J., Park, H.: Solving the small sample size problem in face recognition using generalized discriminant analysis. Pattern Recognition 39, 277–287 (2006)

  14. Bezdek, J.C., Kuncheva, L.I.: Nearest prototype classifier designs: An experimental study. International Journal of Intelligent Systems 16(12), 1445–1473 (2001)

  15. Dasarathy, B.V.: Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press, Los Alamitos (1991)

  16. Kim, S.W., Oommen, B.J.: Enhancing prototype reduction schemes with LVQ3-type algorithms. Pattern Recognition 36, 1083–1093 (2003)

  17. Kim, S.W., Oommen, B.J.: Enhancing prototype reduction schemes with recursion: A method applicable for “large” data sets. IEEE Trans. Systems, Man, and Cybernetics - Part B SMC-34(3), 1384–1397 (2004)

  18. Huttenlocher, D.P., Klanderman, G.A., Rucklidge, W.J.: Comparing images using the Hausdorff distance. IEEE Trans. Pattern Anal. and Machine Intell PAMI-15(9), 850–863 (1993)

  19. Guo, B., Lam, K.M., Lin, K.H., Siu, W.C.: Human face recognition based on spatially weighted Hausdorff distance. Pattern Recognition Letters 24, 499–507 (2003)

  20. Zhao, C., Shi, W., Deng, Y.: A new Hausdorff distance for image matching. Pattern Recognition Letters 26, 581–586 (2005)

  21. Wu, B.F., Chen, Y.L., Chiu, C.C.: A discriminant analysis based recursive automatic thresholding approach for image segmentation. IEICE Trans. Inf. & Syst E88-D(7), 1716–1723 (2005)

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kim, SW. (2006). Optimizing Dissimilarity-Based Classifiers Using a Newly Modified Hausdorff Distance. In: Hoffmann, A., Kang, Bh., Richards, D., Tsumoto, S. (eds) Advances in Knowledge Acquisition and Management. PKAW 2006. Lecture Notes in Computer Science (LNAI), vol. 4303. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11961239_16

  • DOI: https://doi.org/10.1007/11961239_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-68955-3

  • Online ISBN: 978-3-540-68957-7

  • eBook Packages: Computer Science, Computer Science (R0)
