Incremental Embedding Within a Dissimilarity-Based Framework

  • Rachid Hafiane
  • Luc Brun
  • Salvatore Tabbone
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9069)


Structural pattern recognition methods based on strings or graphs provide a natural encoding of the relationships between an object's parts, but can usually be combined with only a limited set of machine learning methods. The last decade has seen major advances aiming to bridge these two fields. The two main research directions are the design of new graph and string kernels, and explicit embedding schemes for structural data. Explicit embeddings of structural data can be combined with any machine learning method. Dissimilarity representation methods are important because they provide both an explicit embedding and a connection with the kernel framework. However, these methods require the whole universe of objects to be known during the learning phase, and, to obtain a Euclidean embedding, the matrix encoding the dissimilarities between every pair of objects must be regularized. This last point violates the usual separation between training and test sets, since both sets must be processed jointly; it is an important limitation in many practical applications where the test set is unbounded and unknown during the learning phase. Moreover, requiring the whole universe is a bottleneck for the processing of massive datasets. In this paper, we propose to overcome these limitations through an incremental embedding based on dissimilarity representations. We study the pros and cons of two methods that compute, implicitly and separately, the embeddings of the points in the test set and of those in the learning set. Conclusions are drawn from experiments performed on several datasets.
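The pipeline sketched in the abstract — a Euclidean embedding of a pairwise dissimilarity matrix, followed by a separate, incremental embedding of test points — can be illustrated with classical multidimensional scaling and a Nyström-style out-of-sample extension. This is a minimal sketch of the general idea only, not the paper's actual method: the function names `embed_train`/`embed_test` and the toy data are illustrative assumptions.

```python
import numpy as np

def embed_train(D):
    """Classical MDS: turn a symmetric dissimilarity matrix D (n x n)
    into Euclidean coordinates via double centering + eigendecomposition."""
    n = D.shape[0]
    D2 = D ** 2
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ D2 @ J                    # Gram matrix of the embedding
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1]                # sort eigenvalues, largest first
    w, V = w[idx], V[:, idx]
    pos = w > 1e-9 * w.max()                 # keep only positive eigenvalues
    w, V = w[pos], V[:, pos]
    X = V * np.sqrt(w)                       # training coordinates, one row per object
    return X, V, w, D2

def embed_test(d_new, D2_train, V, w):
    """Out-of-sample (Nystrom-style) extension: embed a new object from its
    dissimilarities d_new to the n training objects, without recomputing
    or modifying the training embedding."""
    d2 = d_new ** 2
    # Center the new kernel row with the training statistics only.
    k = -0.5 * (d2 - D2_train.mean(axis=1) - d2.mean() + D2_train.mean())
    return (V.T @ k) / np.sqrt(w)

# Sanity check on Euclidean toy data: the embedding reproduces the distances.
rng = np.random.default_rng(0)
P = rng.normal(size=(6, 2))
D_full = np.linalg.norm(P[:, None] - P[None, :], axis=-1)

X, V, w, D2 = embed_train(D_full[:5, :5])    # learning set: first 5 objects
y = embed_test(D_full[5, :5], D2, V, w)      # test object: the 6th

print(np.allclose(np.linalg.norm(X[0] - X[1]), D_full[0, 1]))   # True
print(np.allclose(np.linalg.norm(X - y, axis=1), D_full[5, :5]))  # True
```

The point of the separation is that `embed_test` only reads training-set statistics, so test objects can arrive one at a time after learning is finished — the situation the abstract describes, where the test set is unbounded and unknown during the learning phase.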





Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Université de Lorraine, LORIA UMR 7503, Vandœuvre-lès-Nancy Cedex, France
  2. GREYC UMR CNRS 6072, ENSICAEN, Caen Cedex, France
