
Part of the book series: Intelligent Systems Reference Library (ISRL, volume 104)


Abstract

Object tracking is an essential step of a video processing pipeline. In the Fish4Knowledge project, extracting fish trajectories provides information to higher-level modules, such as behavior understanding and population size estimation. However, video quality limitations and the appearance and motion characteristics of fish make the task much more challenging than in typical human tracking applications in urban settings. To address this problem, robust appearance and motion models must be employed: this chapter describes the approach devised to tackle the fish tracking problem in the project, and presents an evaluation of the tracking algorithm against state-of-the-art techniques.
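As a rough illustration of what combining an appearance model with a motion model can look like, the sketch below scores candidate detections for a track by mixing a histogram-based appearance similarity with a constant-velocity motion prediction. This is a hypothetical, simplified example, not the algorithm described in the chapter; the function names, the Bhattacharyya-style similarity, and the Gaussian motion penalty are all assumptions chosen only to make the idea concrete.

```python
# Hypothetical sketch of a tracker that fuses an appearance cue with a
# constant-velocity motion cue. NOT the chapter's algorithm; for illustration only.
import numpy as np

def appearance_score(hist_a, hist_b):
    """Bhattacharyya-style similarity between two normalized color histograms."""
    return float(np.sum(np.sqrt(hist_a * hist_b)))

def motion_score(predicted_pos, candidate_pos, sigma=20.0):
    """Gaussian penalty on the distance from the motion-predicted position."""
    d = np.linalg.norm(np.asarray(predicted_pos) - np.asarray(candidate_pos))
    return float(np.exp(-(d ** 2) / (2 * sigma ** 2)))

def associate(track, candidates, alpha=0.5):
    """Pick the candidate (position, histogram) that best matches the track.

    `track` holds the last position, velocity and appearance histogram;
    `candidates` are assumed to come from a per-frame fish detector.
    """
    predicted = track["pos"] + track["vel"]  # constant-velocity prediction
    best, best_score = None, -1.0
    for pos, hist in candidates:
        s = alpha * appearance_score(track["hist"], hist) \
            + (1 - alpha) * motion_score(predicted, pos)
        if s > best_score:
            best, best_score = (pos, hist), s
    return best, best_score
```

In such a scheme the weight `alpha` trades off appearance against motion; real underwater trackers need far richer appearance descriptors and model updating to cope with the erratic motion and deformations of fish.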




Author information

Corresponding author: Simone Palazzo.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Giordano, D., Palazzo, S., Spampinato, C. (2016). Fish Tracking. In: Fisher, R., Chen-Burger, YH., Giordano, D., Hardman, L., Lin, FP. (eds) Fish4Knowledge: Collecting and Analyzing Massive Coral Reef Fish Video Data. Intelligent Systems Reference Library, vol 104. Springer, Cham. https://doi.org/10.1007/978-3-319-30208-9_10


  • DOI: https://doi.org/10.1007/978-3-319-30208-9_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30206-5

  • Online ISBN: 978-3-319-30208-9

  • eBook Packages: Engineering (R0)
