Recognition of Sequences of Graphical Patterns

  • Edmondo Trentin
  • ShuJia Zhang
  • Markus Hagenbuchner
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5998)

Abstract

Several real-world problems (e.g., in bioinformatics/proteomics, or in the recognition of video sequences) can naturally be described as classification tasks over sequences of structured data, i.e., sequences of graphs. This paper presents a novel machine that can learn and carry out decision-making over sequences of graphical data. The machine involves a hidden Markov model whose state-emission probabilities are defined over graphs, realized by combining recursive encoding networks and constrained radial basis function networks. A global optimization algorithm that regards the machine as a whole (rather than a bare superposition of separate modules) is introduced, via gradient ascent over the maximum-likelihood criterion within a Baum-Welch-like forward-backward procedure. To the best of our knowledge, this is the first machine learning approach capable of processing sequences of graphs without the need for a pre-processing step. Preliminary results are reported.
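To make the architecture described in the abstract concrete, the following is a minimal Python/NumPy sketch of how state-emission probabilities defined over graphs could plug into the HMM forward recursion. The function names (encode_graph, rbf_emission, forward_likelihood), the mean-pooling encoder, and all parameter shapes are illustrative assumptions, not the implementation used in the paper.

    import numpy as np

    # Hypothetical stand-in for the recursive encoding network: maps a graph
    # (adjacency matrix plus node-feature matrix) to a fixed-size vector.
    def encode_graph(adjacency, node_features, dim=4):
        h = node_features
        for _ in range(3):                 # a few propagation steps over the topology
            h = np.tanh(adjacency @ h)
        return h.mean(axis=0)[:dim]        # mean-pool node states into one encoding

    # Constrained radial basis function network playing the role of the
    # state-emission density b_j(G): a convex mixture of Gaussian kernels
    # evaluated on the graph encoding.
    def rbf_emission(x, centers, widths, mix_weights):
        d2 = ((x - centers) ** 2).sum(axis=1)
        kernels = np.exp(-d2 / (2.0 * widths ** 2))
        kernels /= (np.sqrt(2.0 * np.pi) * widths) ** x.size
        return float(mix_weights @ kernels)

    # Forward (alpha) recursion of the HMM over a sequence of graphs, with the
    # emission probabilities supplied by the encoder + RBF modules above.
    def forward_likelihood(graph_seq, pi, A, emission_params):
        alpha = None
        for t, (adj, feats) in enumerate(graph_seq):
            x = encode_graph(adj, feats)
            b = np.array([rbf_emission(x, *p) for p in emission_params])
            alpha = pi * b if t == 0 else (alpha @ A) * b
        return alpha.sum()                 # P(graph sequence | model)

In the paper the modules are trained jointly by gradient ascent on the likelihood within a Baum-Welch-like procedure; the sketch above only evaluates the sequence likelihood for fixed parameters.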

Keywords

Hidden Markov model, relational learning, recursive networks

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Edmondo Trentin (1)
  • ShuJia Zhang (2)
  • Markus Hagenbuchner (2)
  1. DII, Università di Siena, Siena, Italy
  2. University of Wollongong, Wollongong, Australia
