
MS3D: Mean-Shift Object Tracking Boosted by Joint Back Projection of Color and Depth

  • Yongheng Zhao
  • Emanuele Menegatti
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 867)

Abstract

In this paper, we present the MS3D tracker, which extends the mean-shift tracking algorithm in several ways when RGB-D data is available. We efficiently fuse the color and depth distributions within the mean-shift tracking scheme. In addition, to improve the robustness of the description of the tracked object, we further process the pixels in the rectangular region of interest (ROI) returned by mean-shift. We apply depth distribution analysis to the pixels of the ROI in order to separate background pixels from pixels belonging to the tracked object (i.e., the target region). Then, we use the color histograms of the target region and its surroundings to build a discriminative color model that distinguishes the object from the background. The proposed algorithm is evaluated on the RGB-D tracking dataset proposed by [1]: it ranks first while running in real time, showing both accuracy and robustness on the challenging sequences with background clutter, occlusion, scale variation, and shape deformation.
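The joint back projection of color and depth that drives the mean-shift iteration can be illustrated with a short sketch. The Python/OpenCV code below is a minimal illustration and not the authors' implementation: the histogram bin counts, the depth range (assumed to be in millimeters, as in typical RGB-D sensors), and the element-wise product used to fuse the two back projections are illustrative assumptions.

```python
# Minimal sketch of mean-shift tracking on a joint color-depth back projection.
# This is an illustrative approximation, not the MS3D implementation.
import cv2
import numpy as np

def init_models(hsv, depth, window):
    """Build hue and depth histograms from the initial target ROI."""
    x, y, w, h = window
    roi_hsv = hsv[y:y + h, x:x + w]
    roi_depth = depth[y:y + h, x:x + w]
    # Hue histogram of the target region; mask out dark / low-saturation pixels.
    mask = cv2.inRange(roi_hsv, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
    hue_hist = cv2.calcHist([roi_hsv], [0], mask, [32], [0, 180])
    cv2.normalize(hue_hist, hue_hist, 0, 255, cv2.NORM_MINMAX)
    # Depth histogram of the target region (depth assumed as uint16 millimeters).
    depth_hist = cv2.calcHist([roi_depth], [0], None, [64], [0, 8000])
    cv2.normalize(depth_hist, depth_hist, 0, 255, cv2.NORM_MINMAX)
    return hue_hist, depth_hist

def track(hsv, depth, window, hue_hist, depth_hist):
    """One mean-shift step on the product of color and depth back projections."""
    bp_color = cv2.calcBackProject([hsv], [0], hue_hist, [0, 180], 1)
    bp_depth = cv2.calcBackProject([depth], [0], depth_hist, [0, 8000], 1)
    # Joint likelihood: a pixel must agree with both the color and the depth model.
    joint = (bp_color.astype(np.float32) * bp_depth.astype(np.float32)) / 255.0
    joint = joint.astype(np.uint8)
    crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, window = cv2.meanShift(joint, window, crit)
    return window
```

In the paper the fused back projection is refined further: the depth distribution inside the ROI returned by mean-shift is analysed to discard background pixels, and the color model is rebuilt discriminatively from the target and surrounding histograms; those steps are omitted from this sketch for brevity.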

Keywords

Mean-shift · RGB-D object tracking · Fusion of color and depth

References

  1. Xiao, J., Stolkin, R., Gao, Y., Leonardis, A.: Robust fusion of color and depth data for RGB-D target tracking using adaptive range-invariant depth models and spatio-temporal consistency constraints. IEEE Trans. Cybern. (2017)
  2. Wu, Y., Lim, J., Yang, M.-H.: Object tracking benchmark. IEEE Trans. Pattern Anal. Mach. Intell. 37(9), 1834–1848 (2015)
  3. Henriques, J.F., Caseiro, R., Martins, P., Batista, J.: High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 37(3), 583–596 (2015)
  4. Kalal, Z., Mikolajczyk, K., Matas, J.: Tracking-learning-detection. IEEE Trans. Pattern Anal. Mach. Intell. 34(7), 1409–1422 (2012)
  5. Comaniciu, D., Ramesh, V., Meer, P.: Kernel-based object tracking. IEEE Trans. Pattern Anal. Mach. Intell. 25(5), 564–577 (2003)
  6. Hare, S., Golodetz, S., Saffari, A., Vineet, V., Cheng, M.-M., Hicks, S.L., Torr, P.H.: Struck: structured output tracking with kernels. IEEE Trans. Pattern Anal. Mach. Intell. 38(10), 2096–2109 (2016)
  7. Possegger, H., Mauthner, T., Bischof, H.: In defense of color-based model-free tracking. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2113–2120 (2015)
  8. Vojir, T., Noskova, J., Matas, J.: Robust scale-adaptive mean-shift for tracking. Pattern Recognit. Lett. 49, 250–258 (2014)
  9. Kristan, M., Pflugfelder, R., Leonardis, A., Matas, J., Čehovin, L., Nebehay, G., Vojir, T., Fernandez, G., Lukežič, A., Dimitriev, A., et al.: The visual object tracking VOT2014 challenge results (2014)
  10. Bertinetto, L., Valmadre, J., Golodetz, S., Miksik, O., Torr, P.H.: Staple: complementary learners for real-time tracking. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1401–1409 (2016)
  11. Stolkin, R., Florescu, I., Kamberov, G.: An adaptive background model for CAMSHIFT tracking with a moving camera. In: Advances in Pattern Recognition, pp. 147–151. World Scientific (2007)
  12. Ning, J., Zhang, L., Zhang, D., Wu, C.: Robust object tracking using joint color-texture histogram. Int. J. Pattern Recognit. Artif. Intell. 23(07), 1245–1263 (2009)
  13. Ning, J., Zhang, L., Zhang, D.: Robust mean-shift tracking with corrected background-weighted histogram. IET Comput. Vis. 6(1), 62–69 (2012)
  14. Hu, Q., Guo, Y., Chen, Y., Xiao, J., An, W.: Correlation filter tracking: beyond an open-loop system
  15. Fan, H., Ling, H.: Parallel tracking and verifying: a framework for real-time and high accuracy visual tracking. arXiv preprint arXiv:1708.00153 (2017)
  16. Zhao, Y., Carraro, M., Munaro, M., Menegatti, E.: Robust multiple object tracking in RGB-D camera networks. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 6625–6632. IEEE (2017)
  17. Song, S., Xiao, J.: Tracking revisited using RGBD camera: unified benchmark and baselines. In: IEEE International Conference on Computer Vision (ICCV), pp. 233–240. IEEE (2013)
  18. Hannuna, S., Camplani, M., Hall, J., Mirmehdi, M., Damen, D., Burghardt, T., Paiement, A., Tao, L.: DS-KCF: a real-time tracker for RGB-D data. J. Real Time Image Process. 1–20 (2016)
  19. Meshgi, K., Maeda, S.-I., Oba, S., Skibbe, H., Li, Y.-Z., Ishii, S.: An occlusion-aware particle filter tracker to handle complex and persistent occlusions. Comput. Vis. Image Underst. 150, 81–94 (2016)
  20. Hidayatullah, P., Konik, H.: CAMSHIFT improvement on multi-hue object and multi-object tracking. In: 3rd European Workshop on Visual Information Processing (EUVIP), pp. 143–148. IEEE (2011)
  21. Bradski, G.R.: Real time face and object tracking as a component of a perceptual user interface. In: Proceedings of the Fourth IEEE Workshop on Applications of Computer Vision, WACV 1998, pp. 214–219. IEEE (1998)
  22. Camplani, M., Hannuna, S., Mirmehdi, M., Damen, D., Paiement, A., Tao, L., Burghardt, T.: Real-time RGB-D tracking with depth scaling kernelised correlation filters and occlusion handling (2015)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Information Engineering (DEI), University of Padova, Padova, Italy
