Optical Flow with Geometric Occlusion Estimation and Fusion of Multiple Frames

  • Ryan Kennedy
  • Camillo J. Taylor
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8932)

Abstract

Optical flow research has made significant progress in recent years, and flow can now be computed efficiently and accurately for many images. However, complex motions, large displacements, and difficult imaging conditions are still problematic. In this paper, we present a framework for estimating optical flow that improves on these difficult cases by 1) estimating occlusions and 2) using additional temporal information. First, we divide the image into discrete triangles and show how this allows occluded regions to be naturally estimated and directly incorporated into the optimization algorithm. We additionally propose a novel method of exploiting temporal information in image sequences by using “inertial estimates” of the flow. These estimates are combined using a classifier-based fusion scheme, which significantly improves results. These contributions are evaluated on three different optical flow datasets, and we achieve state-of-the-art results on MPI-Sintel.
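The abstract only sketches the two contributions, so the following Python snippet is a rough, hypothetical illustration rather than the authors' algorithm. It shows (a) one way a triangulated image mesh can expose likely occlusions, by flagging triangles whose orientation flips or whose area collapses when their vertices are advected by the flow, and (b) a constant-velocity extrapolation of a previous flow field, one plausible reading of an "inertial estimate". The function names, the fold-over rule, and the thresholds are assumptions made for illustration only.

# Hedged sketch (not the paper's implementation) of two ideas suggested by the abstract.
import numpy as np

def signed_area(p0, p1, p2):
    """Twice the signed area of triangle (p0, p1, p2); inputs have shape (..., 2)."""
    return ((p1[..., 0] - p0[..., 0]) * (p2[..., 1] - p0[..., 1])
            - (p2[..., 0] - p0[..., 0]) * (p1[..., 1] - p0[..., 1]))

def occlusion_from_foldover(vertices, triangles, flow_at_vertices, area_ratio_thresh=0.1):
    """Flag triangles whose orientation flips or whose area collapses after warping.

    vertices:         (V, 2) pixel coordinates of mesh vertices
    triangles:        (T, 3) integer indices into `vertices`
    flow_at_vertices: (V, 2) flow vectors sampled at the vertices
    Returns a boolean array of length T (True = likely occluded).
    """
    p = vertices[triangles]                 # (T, 3, 2) original vertex positions
    q = p + flow_at_vertices[triangles]     # (T, 3, 2) positions advected by the flow
    a_src = signed_area(p[:, 0], p[:, 1], p[:, 2])
    a_dst = signed_area(q[:, 0], q[:, 1], q[:, 2])
    flipped = np.sign(a_dst) != np.sign(a_src)              # fold-over: orientation reversed
    collapsed = np.abs(a_dst) < area_ratio_thresh * np.abs(a_src)  # near-degenerate triangle
    return flipped | collapsed

def inertial_flow_estimate(flow_prev, step=1.0):
    """Constant-velocity extrapolation of a previous flow field of shape (H, W, 2).

    Each pixel's previous flow vector is carried to the location it pointed to,
    predicting the next frame pair's flow (a crude forward warp; unfilled pixels
    stay zero and are left for a later fusion stage to override).
    """
    h, w, _ = flow_prev.shape
    pred = np.zeros_like(flow_prev)
    ys, xs = np.mgrid[0:h, 0:w]
    xt = np.clip(np.round(xs + step * flow_prev[..., 0]).astype(int), 0, w - 1)
    yt = np.clip(np.round(ys + step * flow_prev[..., 1]).astype(int), 0, h - 1)
    pred[yt, xt] = flow_prev[ys, xs]
    return pred

The abstract describes a classifier-based scheme (a random forest, per the keywords) that fuses such candidate flows with the directly estimated flow; that fusion stage is omitted from this sketch.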

Keywords

Cost Function, Random Forest, Optical Flow, Quadrature Point, Cholesky Factorization

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Ryan Kennedy (1)
  • Camillo J. Taylor (1)
  1. Department of Computer and Information Science, University of Pennsylvania, USA