
Object tracking method based on particle filter of adaptive patches combined with multi-features fusion

  • Meng Cai-xia
  • Zhang Xin-yan

Abstract

Object tracking has been one of the most important and active research areas in computer vision. In this paper, we address the problem of object tracking under complex conditions in video and propose an object tracking method based on a particle filter over adaptive patches that combines color histograms with the Histogram of Oriented Gradients (HOG). The adaptive patches are obtained by horizontal and vertical projection of the object's gray levels, which improves the patches' adaptability to diverse object appearances and the accuracy of tracking under occlusion. The fusion of color histograms and HOG features is adopted to describe each sub-patch, which not only resolves tracking drift toward similar objects but also reduces the effect of local deformation. In addition, a weighted Bhattacharyya coefficient is introduced to compute the matching degree of each particle's sub-patches, the sub-patch weights are adjusted by integrating the particles' spatial information, and the feature model is updated over time to achieve robust tracking. Extensive simulation experiments show that the proposed algorithm achieves more favorable performance than existing state-of-the-art algorithms on various challenging videos, especially under occlusion and shape deformation.
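To make the matching step concrete, the following minimal Python sketch shows how per-sub-patch color and gradient-orientation histograms could be fused with a weighted Bhattacharyya coefficient into a particle's observation likelihood. This is not the authors' implementation: the grayscale-only histogram, the simplified single-cell HOG, the equal 0.5/0.5 fusion weights, the Gaussian likelihood with bandwidth `sigma`, and names such as `particle_likelihood` are illustrative assumptions.

```python
import numpy as np

def color_histogram(patch, bins=16):
    """Normalized intensity histogram of a sub-patch (sketch; the paper
    uses color histograms, reduced here to one channel for brevity)."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 255))
    return hist / (hist.sum() + 1e-12)

def hog_histogram(patch, bins=9):
    """Coarse histogram of oriented gradients for a sub-patch (simplified:
    one orientation histogram over the whole patch, no cell/block layout)."""
    gy, gx = np.gradient(patch.astype(np.float64))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)   # unsigned gradient orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two normalized histograms."""
    return np.sum(np.sqrt(p * q))

def particle_likelihood(sub_patches, templates, patch_weights, sigma=0.1):
    """Fuse per-sub-patch similarities into one particle weight.
    `patch_weights` are hypothetical per-patch reliabilities (e.g. reduced
    for occluded patches); `templates` hold reference (color, HOG) pairs."""
    score = 0.0
    for patch, (ref_color, ref_hog), w in zip(sub_patches, templates, patch_weights):
        rho = 0.5 * bhattacharyya(color_histogram(patch), ref_color) \
            + 0.5 * bhattacharyya(hog_histogram(patch), ref_hog)
        score += w * rho
    score /= (np.sum(patch_weights) + 1e-12)
    # Map the fused similarity to a particle-filter observation likelihood.
    return np.exp(-(1.0 - score) / (2.0 * sigma ** 2))
```

In a particle-filter loop, this likelihood would be evaluated for every candidate state, and the resulting weights normalized before resampling; down-weighting sub-patches with persistently low matching degree is one way such a scheme can tolerate partial occlusion.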

Keywords

Object tracking · Particle filter · Color histogram · Histogram of oriented gradients · Adaptability · Projection

Notes

Acknowledgements

1. The Scientific and Technological Research Program of Henan Province, No. 172102210441.

2. Key Scientific Research Projects in Henan Colleges and Universities, No. 18B520034.

3. The Ministry of Public Security Technical Research Plan under grant No. 2016JSYJB38.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Image and Network Investigation, Railway Police College, Zhengzhou, China
  2. Department of Computer and Information Engineering, Luoyang Institute of Science and Technology, Luoyang, China
