Closed-Loop Adaptation for Robust Tracking

  • Jialue Fan
  • Xiaohui Shen
  • Ying Wu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6311)

Abstract

Model updating is a critical problem in tracking. Inaccurate extraction of foreground and background information during model adaptation causes the model to drift and degrades tracking performance. The most direct yet difficult solution to the drift problem is to obtain accurate boundaries of the target. We approach such a solution by proposing a novel closed-loop model adaptation framework based on the combination of matting and tracking. In our framework, the scribbles for matting are all automatically generated, which makes matting applicable in a tracking system. Meanwhile, accurate boundaries of the target can be obtained from matting results even when the target undergoes large deformation. An effective model is further constructed and successfully updated based on such accurate boundaries. Extensive experiments show that our closed-loop adaptation scheme largely avoids model drift and significantly outperforms other discriminative tracking models as well as video matting approaches.
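The closed loop described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual method: the tracker's box is turned into automatic scribbles (a trimap), a matting step estimates per-pixel alpha, the thresholded matte yields the target boundary, and the appearance model is updated from that boundary. The function names (`make_trimap`, `toy_matting`, `closed_loop_step`) are assumptions, and `toy_matting` is a crude intensity-based placeholder for a real matting solver such as closed-form matting.

```python
import numpy as np

def make_trimap(shape, box, band=4):
    """Automatically generate matting scribbles from the tracker's box
    (a simplified stand-in for the paper's scribble generation):
    well inside the box -> foreground (1.0), well outside -> background
    (0.0), a band around the box boundary -> unknown (0.5)."""
    h, w = shape
    x0, y0, x1, y1 = box
    trimap = np.zeros((h, w))
    trimap[max(y0 - band, 0):min(y1 + band, h),
           max(x0 - band, 0):min(x1 + band, w)] = 0.5
    trimap[y0 + band:y1 - band, x0 + band:x1 - band] = 1.0
    return trimap

def toy_matting(frame, trimap):
    """Placeholder matting step: known labels are kept; in the unknown
    band, alpha is set by how close a pixel's intensity is to the mean
    foreground vs. background intensity. A real system would solve the
    matting Laplacian here instead."""
    alpha = trimap.copy()
    unknown = trimap == 0.5
    fg_mean = frame[trimap == 1.0].mean()
    bg_mean = frame[trimap == 0.0].mean()
    d_fg = np.abs(frame - fg_mean)
    d_bg = np.abs(frame - bg_mean)
    alpha[unknown] = (d_bg / (d_fg + d_bg + 1e-9))[unknown]
    return alpha

def closed_loop_step(frame, box, model):
    """One iteration of the closed loop: tracked box -> trimap ->
    alpha matte -> accurate target boundary -> model update."""
    trimap = make_trimap(frame.shape, box)
    alpha = toy_matting(frame, trimap)
    mask = alpha > 0.5  # thresholded matte as the target region
    model["fg_mean"] = frame[mask].mean()
    model["bg_mean"] = frame[~mask].mean()
    return mask, model

# Usage on a synthetic grayscale frame: a bright square on a dark
# background, with a deliberately misaligned tracking box.
frame = np.zeros((40, 40))
frame[10:30, 10:30] = 1.0
mask, model = closed_loop_step(frame, (12, 12, 28, 28), {})
```

Even though the box (12, 12, 28, 28) misses part of the square, the matte in the unknown band snaps the mask back to the true object boundary, which is what allows the model update to avoid drift in this toy setting.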

Keywords

Current Frame · Tracking Result · Salient Point · Robust Tracking · Accurate Boundary



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Jialue Fan¹
  • Xiaohui Shen¹
  • Ying Wu¹

  1. Northwestern University, Evanston
