The Quadratic-Chi Histogram Distance Family

  • Ofir Pele
  • Michael Werman
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6312)


We present a new histogram distance family, the Quadratic-Chi (QC). QC members are Quadratic-Form distances with a cross-bin χ²-like normalization. This cross-bin χ²-like normalization reduces the undue influence of large bins. Such normalization was shown to be helpful in many cases where the χ² histogram distance outperformed the L2 norm. However, χ² is sensitive to quantization effects, such as those caused by lighting changes, shape deformations, etc. The Quadratic-Form part of QC members accounts for cross-bin relationships (e.g. red and orange), alleviating the quantization problem. We present two new cross-bin histogram distance properties, Similarity-Matrix-Quantization-Invariance and Sparseness-Invariance, and show that QC distances have both. We also show experimentally that these properties boost performance. The computational complexity of QC distances is linear in the number of non-zero entries in the bin-similarity matrix and the histograms, and the computation is easily parallelized. We present results for image retrieval using the Scale Invariant Feature Transform (SIFT) and color image descriptors. In addition, we present results for shape classification using Shape Context (SC) and Inner Distance Shape Context (IDSC). We show that the new QC members outperform state-of-the-art distances for these tasks while having a short running time. The experimental results show that both the cross-bin property and the normalization are important.
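The abstract describes a Quadratic-Form distance whose bin differences are first divided by a cross-bin χ²-like normalizer. A minimal sketch of that idea in NumPy is shown below; the function name and the 0/0 := 0 handling are illustrative choices, not the authors' reference implementation:

```python
import numpy as np

def qc_distance(P, Q, A, m=0.5):
    """Sketch of a Quadratic-Chi style distance.

    P, Q : 1-D histograms of equal length.
    A    : bin-similarity matrix (A[i, j] large when bins i, j are similar).
    m    : normalization exponent, 0 <= m < 1.
    Each difference P_i - Q_i is divided by a cross-bin chi^2-like
    normalizer before the quadratic form is applied; 0/0 is taken as 0.
    """
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    A = np.asarray(A, dtype=float)
    Z = (P + Q) @ A            # cross-bin normalizer: sum_c (P_c + Q_c) A[c, i]
    Z[Z == 0] = 1.0            # implements the 0/0 := 0 convention
    D = (P - Q) / Z**m         # normalized differences
    return np.sqrt(max(D @ A @ D, 0.0))
```

With `A` set to the identity matrix, no cross-bin relationships remain and the result degenerates to a χ²-like bin-to-bin distance; a non-diagonal `A` (e.g. encoding red/orange similarity) is what alleviates quantization effects.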


Keywords: Image Retrieval; Similarity Matrix; Query Image; Scale Invariant Feature Transform; Shape Context



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Ofir Pele (1)
  • Michael Werman (1)
  1. School of Computer Science, The Hebrew University of Jerusalem
