Dynamic Trees: Learning to Model Outdoor Scenes

  • Nicholas J. Adams
  • Christopher K. I. Williams
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2353)

Abstract

This paper considers the dynamic tree (DT) model, first introduced in [1]. A dynamic tree specifies a prior over tree structures, each of which is a forest of one or more tree-structured belief networks (TSBNs). In the literature, standard tree-structured belief network models have been found to produce “blocky” segmentations when naturally occurring boundaries within an image do not coincide with those of the subtrees in the network's fixed structure. Dynamic trees have a flexible architecture that allows the structure to vary, creating configurations where the subtree and image boundaries align, and experimentation with the model has shown significant improvements.
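
The structure prior can be pictured concretely: nodes are arranged in layers from coarse to fine, and each node either selects a parent in the layer above or disconnects to become the root of a new subtree, so a single draw from the prior yields a forest of TSBNs. The sketch below illustrates this idea with a simplified one-dimensional, affinity-based parent prior; the layer layout, the candidate-parent neighbourhood, and the disconnect_weight parameter are illustrative assumptions, not the exact prior used in the paper.

```python
import numpy as np

def sample_dt_structure(layer_sizes, disconnect_weight=0.5, seed=0):
    """Illustrative sketch: draw one structure from a dynamic-tree-style prior.

    Nodes are arranged in layers (coarse to fine).  Each node in layer l either
    picks a parent from layer l-1 or disconnects to start a new tree, so one
    draw yields a forest of tree-structured belief networks.
    """
    rng = np.random.default_rng(seed)
    # Global index offset of the first node in each layer.
    offsets = np.cumsum([0] + list(layer_sizes[:-1]))
    parent = {}                                   # child index -> parent index (roots are absent)
    for l in range(1, len(layer_sizes)):
        above = offsets[l - 1] + np.arange(layer_sizes[l - 1])
        for j in range(layer_sizes[l]):
            child = offsets[l] + j
            # Candidate parents: the "natural" parent position and its immediate neighbours.
            natural = int(round(j * (layer_sizes[l - 1] - 1) / max(layer_sizes[l] - 1, 1)))
            cands = [above[c] for c in (natural - 1, natural, natural + 1)
                     if 0 <= c < layer_sizes[l - 1]]
            weights = np.array([1.0] * len(cands) + [disconnect_weight])
            choice = rng.choice(len(weights), p=weights / weights.sum())
            if choice < len(cands):               # otherwise the node becomes a root
                parent[child] = int(cands[choice])
    return parent
```

Calling sample_dt_structure([1, 4, 16]), for example, returns a child-to-parent map in which any node absent from the map is the root of its own subtree.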

Here we derive an EM-style update based upon mean-field inference for learning the parameters of the dynamic tree model, and apply it to a database of images of outdoor scenes, learning all of the model's parameters from the data. DTs are seen to offer a significant improvement in performance over the fixed-architecture TSBN, and in a coding comparison the DT achieves 0.294 bits per pixel (bpp) compression compared with 0.378 bpp for lossless JPEG on images of 7 colours.
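
The general shape of such an EM-style procedure can be sketched for the simpler case of a fixed-structure TSBN with discrete labels and a single shared conditional probability table: the E-step runs coordinate-ascent updates on a fully factorised (mean-field) posterior, and the M-step re-estimates the parameters from expected counts under that posterior. The paper's full derivation also treats the DT's tree-structure variables, which this sketch omits; the function and variable names below are illustrative only.

```python
import numpy as np

def softmax(v):
    v = v - v.max()
    e = np.exp(v)
    return e / e.sum()

def mean_field_em(parent, leaf_obs, K, n_em=20, n_mf=10, seed=0):
    """EM with a factorised (mean-field) E-step for a fixed-structure TSBN.

    parent[i] -- index of node i's parent, or -1 if i is a root
    leaf_obs  -- dict {node: observed label} (hard evidence at the leaves)
    K         -- number of label classes
    """
    rng = np.random.default_rng(seed)
    N = len(parent)
    children = [[] for _ in range(N)]
    for i, p in enumerate(parent):
        if p >= 0:
            children[p].append(i)

    # Shared parameters for simplicity: one CPT for all edges, one root prior.
    cpt = rng.dirichlet(np.ones(K), size=K).T      # cpt[k, j] = P(child = k | parent = j)
    root_prior = np.full(K, 1.0 / K)

    # Factorised posterior q[i] over the K labels of each node; clamp observed leaves.
    q = np.full((N, K), 1.0 / K)
    for n, y in leaf_obs.items():
        q[n] = np.eye(K)[y]

    for _ in range(n_em):
        # E-step: coordinate-ascent mean-field sweeps over the hidden nodes.
        for _ in range(n_mf):
            for i in range(N):
                if i in leaf_obs:
                    continue
                if parent[i] >= 0:
                    log_qi = np.log(cpt) @ q[parent[i]]    # E_q[log P(x_i | x_pa(i))]
                else:
                    log_qi = np.log(root_prior).copy()
                for c in children[i]:
                    log_qi += np.log(cpt).T @ q[c]         # E_q[log P(x_c | x_i)]
                q[i] = softmax(log_qi)

        # M-step: re-estimate parameters from expected counts under q.
        counts = np.zeros((K, K))
        for i, p in enumerate(parent):
            if p >= 0:
                counts += np.outer(q[i], q[p])             # E_q[child = k, parent = j]
        cpt = (counts + 1e-6) / (counts + 1e-6).sum(axis=0, keepdims=True)
        roots = [i for i in range(N) if parent[i] < 0]
        root_prior = q[roots].mean(axis=0) + 1e-6
        root_prior /= root_prior.sum()

    return cpt, root_prior, q
```

For instance, a small three-level binary tree with observed leaves can be passed as parent = [-1, 0, 0, 1, 1, 2, 2] and leaf_obs = {3: 0, 4: 0, 5: 1, 6: 1} with K = 2.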

References

  1. Williams, C.K.I., Adams, N.J.: DTs: Dynamic Trees. In Kearns, M.J., Solla, S.A., Cohn, D.A., eds.: Advances in Neural Information Processing Systems 11. MIT Press (1999) 634–640
  2. Bouman, C.A., Shapiro, M.: A Multiscale Random Field Model for Bayesian Image Segmentation. IEEE Transactions on Image Processing 3(2) (1994) 162–177
  3. Feng, X., Williams, C.K.I.: Training Bayesian Networks for Image Segmentation. In: Proceedings of SPIE. Volume 3457. (1998)
  4. Luettgen, M.R., Willsky, A.S.: Likelihood Calculation for a Class of Multiscale Stochastic Models, with Application to Texture Discrimination. IEEE Transactions on Image Processing 4(2) (1995) 194–207
  5. Pearl, J.: Probabilistic Reasoning in Intelligent Systems. 2nd edn. Morgan Kaufmann Publishers Inc., San Francisco, USA (1988)
  6. Chou, P.A.: Recognition of Equations Using a Two-Dimensional Stochastic Context-Free Grammar. Visual Communications and Image Processing IV 1199 (1989) 852–863
  7. Geman, S., Manbeck, K.: Experiments in Syntactic Recognition. Technical Report CICS-P-411, Division of Applied Mathematics, Brown University, Providence, RI 02912 USA (1994)
  8. Adams, N.J., Storkey, A.J., Ghahramani, Z., Williams, C.K.I.: MFDTs: Mean Field Dynamic Trees. In Sanfeliu, A., Villanueva, J.J., Vanrell, A., Alquézar, R., Huang, T., Serra, J., eds.: Proceedings of the 15th International Conference on Pattern Recognition. Volume 3, Image, Speech and Signal Processing. IEEE Computer Society (2000) 151–154
  9. Adams, N.J.: Dynamic Trees: A Hierarchical Probabilistic Approach to Image Modelling. PhD thesis, Institute for Adaptive and Neural Computation, Division of Informatics, University of Edinburgh, 5 Forrest Hill, Edinburgh, EH1 2QL, UK (2001). Available at: http://www.anc.ed.ac.uk/code/adams/
  10. Lauritzen, S.L.: Graphical Models. Oxford University Press (1996)
  11. Geman, S., Geman, D.: Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. IEEE Transactions on Pattern Analysis and Machine Intelligence 6(6) (1984) 721–741
  12. Chellappa, R., Chatterjee, S.: Classification of Textures using Gaussian Markov Random Fields. IEEE Transactions on Acoustics, Speech and Signal Processing 33 (1985) 959–963
  13. Crouse, M., Nowak, R., Baraniuk, R.: Wavelet-based Statistical Signal Processing using Hidden Markov Models. IEEE Transactions on Signal Processing 46 (1998) 886–902
  14. De Bonet, J.S., Viola, P.A.: A Non-Parametric Multi-Scale Statistical Model for Natural Images. In Jordan, M.I., Kearns, M.J., Solla, S.A., eds.: Advances in Neural Information Processing Systems 10. MIT Press, Cambridge, MA (1998)
  15. von der Malsburg, C.: The Correlation Theory of Brain Function. Internal Report 81-2, Max-Planck-Institut für Biophysikalische Chemie (1981). Reprinted in Models of Neural Networks, eds. K. Schulten and H.-J. van Hemmen, 2nd edn., Springer (1994)
  16. von der Malsburg, C.: Dynamic Link Architecture. In Arbib, M.A., ed.: Handbook of Brain Theory and Neural Networks. MIT Press (1995) 329–331
  17. Montanvert, A., Meer, P., Rosenfeld, A.: Hierarchical Image Analysis Using Irregular Tessellations. IEEE Transactions on Pattern Analysis and Machine Intelligence 13(4) (1991) 307–316
  18. Hinton, G.E., Sallans, B., Ghahramani, Z.: A Hierarchical Community of Experts. In Bishop, C.M., ed.: Neural Networks and Machine Learning. Springer-Verlag New York Inc. (1998)
  19. Hinton, G.E., Ghahramani, Z., Teh, Y.W.: Learning to Parse Images. In Solla, S.A., Leen, T.K., Müller, K.R., eds.: Advances in Neural Information Processing Systems 12. MIT Press (2000) 463–469
  20. Geiger, D., Heckerman, D.: Knowledge Representation and Inference in Similarity Networks and Bayesian Multinets. Artificial Intelligence 82 (1996) 45–74
  21. Jordan, M.I., Ghahramani, Z., Jaakkola, T.S., Saul, L.K.: An Introduction to Variational Methods for Graphical Models. In Jordan, M.I., ed.: Learning in Graphical Models. Kluwer Academic Publishers (1998) 105–161
  22. Little, R.J.A., Rubin, D.B.: Statistical Analysis with Missing Data. John Wiley, New York, USA (1987)
  23. Feng, X., Williams, C.K.I., Felderhof, S.N.: Combining Belief Networks and Neural Networks for Scene Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence (2001). Accepted for publication
  24. Adams, N.J., Williams, C.K.I., Storkey, A.J.: Comparing Mean Field and Exact EM in Tree Structured Belief Networks. In: Fourth International ICSC Symposium on Soft Computing and Intelligent Systems for Industry. ICSC-NAISO Academic Press (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Nicholas J. Adams (1)
  • Christopher K. I. Williams (1)

  1. Institute for Adaptive and Neural Computation, University of Edinburgh, Edinburgh, UK
