Joint Estimation of Motion, Structure and Geometry from Stereo Sequences

  • Levi Valgaerts
  • Andrés Bruhn
  • Henning Zimmer
  • Joachim Weickert
  • Carsten Stoll
  • Christian Theobalt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6314)

Abstract

We present a novel variational method for the simultaneous estimation of dense scene flow and structure from stereo sequences. In contrast to existing approaches that rely on a fully calibrated camera setup, we assume that only the intrinsic camera parameters are known. To couple the estimation of motion, structure and geometry, we propose a joint energy functional that integrates spatial and temporal information from two subsequent image pairs subject to an unknown stereo setup. We further introduce a normalisation of image and stereo constraints such that deviations from model assumptions can be interpreted in a geometrical way. Finally, we suggest a separate discontinuity-preserving regularisation to improve the accuracy. Experiments on calibrated and uncalibrated data demonstrate the excellent performance of our approach. We even outperform recent techniques for the rectified case that make explicit use of the simplified geometry.
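To make the structure of such a coupled functional concrete, the following is a minimal illustrative sketch, not the authors' exact formulation; all symbols, penalisers and weights are placeholders. A joint energy over the optical flow $\mathbf{w}$, the stereo correspondence $\mathbf{d}$ and the fundamental matrix $F$ could take the form

$$E(\mathbf{w}, \mathbf{d}, F) \;=\; \int_{\Omega} \Big[\, M(\mathbf{w}, \mathbf{d}) \;+\; \beta\, \Psi\!\big( (\mathbf{x}_r^{\top} F\, \mathbf{x}_l)^2 \big) \Big]\, d\mathbf{x} \;+\; \alpha \int_{\Omega} S(\nabla \mathbf{w}, \nabla \mathbf{d})\, d\mathbf{x},$$

where $M$ collects the (normalised) brightness-constancy data terms between the two subsequent image pairs, the second term penalises deviations of corresponding points $\mathbf{x}_l$, $\mathbf{x}_r$ (in homogeneous coordinates) from the epipolar constraint encoded by $F$, $\Psi$ is a subquadratic penaliser, $S$ is a discontinuity-preserving smoothness term applied separately to the motion and stereo components, and $\alpha, \beta > 0$ are weights.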

Keywords

Fundamental Matrix · Joint Estimation · Epipolar Line · Smoothness Term · Scene Structure

Supplementary material

Electronic Supplementary Material: 978-3-642-15561-1_41_MOESM1_ESM.avi (9.6 MB)

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Levi Valgaerts (1)
  • Andrés Bruhn (1)
  • Henning Zimmer (1)
  • Joachim Weickert (1)
  • Carsten Stoll (2)
  • Christian Theobalt (2)
  1. Mathematical Image Analysis Group, Saarland University, Saarbrücken, Germany
  2. Max-Planck Institute for Informatics, Saarbrücken, Germany
