A Fusion Approach to Grayscale-Thermal Tracking with Cross-Modal Sparse Representation

  • Conference paper in: Image and Graphics Technologies and Applications (IGTA 2018)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 875)


Abstract

Grayscale-thermal tracking has recently received much attention because the visible and thermal infrared modalities complement each other in overcoming the imaging limitations of either source alone. This paper investigates how to fuse grayscale and thermal information effectively for robust object tracking. We propose a novel fusion approach based on cross-modal sparse representation in the Bayesian filtering framework. First, to exploit the interdependence of the modalities, we incorporate both intra- and inter-modality constraints into the sparse representation, i.e., cross-modal sparse representation. Moreover, we introduce modality weights into the model to achieve adaptive fusion. Second, unlike conventional methods, we use the reconstruction residues and coefficients jointly to define the likelihood probability of each candidate sample generated by the motion model. Finally, the object is located as the candidate sample with the maximum likelihood probability. Experimental results on a public benchmark dataset suggest that the proposed approach performs favourably against state-of-the-art grayscale-thermal trackers.
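The pipeline in the abstract (sparse-code each candidate per modality, weight the modalities, score candidates by a residue-based likelihood, pick the maximum) can be illustrated with a much-simplified sketch. This is not the paper's model: the actual cross-modal formulation couples the codes of the two modalities through inter-modality constraints and also folds the coefficients into the likelihood, whereas the sketch below codes each modality independently with a plain lasso (solved by ISTA) and scores candidates from the weighted reconstruction residues only. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def ista_lasso(D, y, lam=0.1, n_iter=100):
    """Solve min_c 0.5*||y - D c||^2 + lam*||c||_1 with ISTA
    (gradient step followed by soft-thresholding)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ c - y)           # gradient of the quadratic term
        z = c - grad / L                   # gradient step
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return c

def candidate_likelihood(templates, observations, weights, lam=0.1):
    """Residue-only surrogate for the likelihood of one candidate.

    templates:    dict modality -> template matrix D_m (d x n)
    observations: dict modality -> candidate feature y_m (d,)
    weights:      dict modality -> modality weight w_m (summing to 1)
    """
    score = 0.0
    for m, D in templates.items():
        c = ista_lasso(D, observations[m], lam=lam)
        residue = np.linalg.norm(observations[m] - D @ c) ** 2
        score += weights[m] * residue      # adaptive per-modality weighting
    return np.exp(-score)                  # smaller weighted residue -> higher likelihood

def locate_object(templates, candidates, weights, lam=0.1):
    """Pick the candidate with maximum likelihood, as in the abstract."""
    scores = [candidate_likelihood(templates, obs, weights, lam) for obs in candidates]
    return int(np.argmax(scores))
```

In a full tracker the candidates would come from the Bayesian motion model (e.g. particles around the previous location), and the modality weights would be re-estimated each frame rather than fixed as they are here.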



Acknowledgement

This work was supported in part by the Natural Science Foundation of Anhui Higher Education Institution of China under Grant KJ2017A017, and in part by the Co-Innovation Center for Information Supply & Assurance Technology, Anhui University under Grant Y01002449.

Author information

Corresponding author

Correspondence to Jin Tang.


Electronic Supplementary Material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 60 KB)


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Li, L., Li, C., Tu, Z., Tang, J. (2018). A Fusion Approach to Grayscale-Thermal Tracking with Cross-Modal Sparse Representation. In: Wang, Y., Jiang, Z., Peng, Y. (eds) Image and Graphics Technologies and Applications. IGTA 2018. Communications in Computer and Information Science, vol 875. Springer, Singapore. https://doi.org/10.1007/978-981-13-1702-6_49


  • DOI: https://doi.org/10.1007/978-981-13-1702-6_49


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-1701-9

  • Online ISBN: 978-981-13-1702-6

  • eBook Packages: Computer Science, Computer Science (R0)
