Combining Stereo and Time-of-Flight Images with Application to Automatic Plant Phenotyping

  • Yu Song
  • Chris A. Glasbey
  • Gerie W. A. M. van der Heijden
  • Gerrit Polder
  • J. Anja Dieleman
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6688)


Abstract

This paper shows how stereo and Time-of-Flight (ToF) images can be combined to estimate dense depth maps in order to automate plant phenotyping. We focus on challenging plant images captured in a glasshouse environment, and show that even state-of-the-art stereo methods produce unsatisfactory results. We develop a geometric approach that transforms the depth information in a ToF image into a localised search range for dense stereo matching, and adopt a global optimisation strategy to produce smooth, discontinuity-preserving results. Since ground-truth pixel-by-pixel depth data are unavailable for our images, as for many other applications, we propose a quantitative evaluation method that accounts for surface smoothness and edge sharpness. We compare our method, with and without ToF, against other state-of-the-art stereo methods, and demonstrate that combining stereo and ToF images gives superior results.
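The core idea of constraining dense stereo with ToF depth can be sketched as follows: since disparity and depth are related by d = f·b/Z, a ToF depth reading with an uncertainty band maps directly to a per-pixel disparity search interval. The sketch below is a minimal illustration of that geometric conversion, not the paper's actual implementation; the sensor-noise value, margin, and function names are assumptions introduced here.

```python
import numpy as np

def tof_disparity_range(tof_depth, focal_px, baseline_m,
                        depth_sigma_m=0.03, n_sigma=3.0):
    """Convert a ToF depth map (in metres, assumed registered to the left
    stereo view) into a per-pixel disparity search interval.

    Disparity relates to depth by d = f * b / Z, so a depth band
    Z +/- n_sigma * depth_sigma maps to a disparity interval
    [f*b/Z_far, f*b/Z_near]. The noise model here is illustrative.
    """
    z = np.clip(tof_depth, 1e-3, None)                    # guard against Z = 0
    z_near = np.clip(z - n_sigma * depth_sigma_m, 1e-3, None)
    z_far = z + n_sigma * depth_sigma_m
    d_min = focal_px * baseline_m / z_far                 # far depth -> small disparity
    d_max = focal_px * baseline_m / z_near                # near depth -> large disparity
    return d_min, d_max

# Hypothetical example: a plant surface 1 m away, 500 px focal length,
# 10 cm baseline. The nominal disparity is 500 * 0.1 / 1.0 = 50 px,
# and the ToF uncertainty band brackets it.
depth = np.full((4, 4), 1.0)
d_min, d_max = tof_disparity_range(depth, focal_px=500.0, baseline_m=0.1)
```

A stereo matcher would then evaluate costs only for disparities inside `[d_min, d_max]` at each pixel, which both speeds up matching and suppresses the spurious matches that plague repetitive plant textures.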


Keywords (machine-generated): Colour image · Depth data · Stereo image · Stereo match · Depth edge



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Yu Song (1)
  • Chris A. Glasbey (1)
  • Gerie W. A. M. van der Heijden (2)
  • Gerrit Polder (2)
  • J. Anja Dieleman (3)

  1. Biomathematics and Statistics Scotland, Edinburgh, UK
  2. Biometris, Wageningen UR, Wageningen, Netherlands
  3. Wageningen UR Greenhouse Horticulture, Wageningen, Netherlands
