Automated recognition and discrimination of human–animal interactions using Fisher vector and hidden Markov model

  • Jian Lian
  • Yuanjie Zheng
  • Weikuan Jia
  • Yanna Zhao
  • Mingqu Fan
  • Dongwei Wang
  • Shuqi Shang
Original Paper

Abstract

Human–animal interactions can affect animal welfare and productivity in rearing environments. Previously proposed human–animal interaction techniques have focused on the manual discrimination of single animal behaviors or of simple human–animal interactions. To automatically detect and classify complex animal behaviors and the animals' reactions to humans, we propose an approach that combines a Fisher vector visual representation with an end-to-end generative hidden Markov model to discriminate both coarse- and fine-grained human–animal interactions. To satisfy the generative approach's requirement for abundant data samples, we recorded and annotated more than 480 hours of video featuring eight people and 210 laying hens during feeding and cleaning. The experimental results show that the proposed method outperforms state-of-the-art approaches. Given its performance on practical videos, our approach can be used to monitor human–animal interactions or animal behaviors in modern poultry farms.
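
As an illustration of the pipeline described above, the following minimal Python sketch (not the authors' implementation) encodes per-frame local descriptors as Fisher vectors with respect to a Gaussian mixture vocabulary and then classifies a video by the class-specific hidden Markov model with the highest log-likelihood. All names, dimensions, and parameters are hypothetical, the mixture-weight terms of the full Fisher vector are omitted for brevity, and it assumes numpy, scikit-learn, and hmmlearn are available.

```python
# Illustrative sketch of a Fisher vector + HMM pipeline; hypothetical data and parameters.
import numpy as np
from sklearn.mixture import GaussianMixture
from hmmlearn.hmm import GaussianHMM

def fisher_vector(descriptors, gmm):
    """First- and second-order Fisher vector of local descriptors w.r.t. a diagonal GMM."""
    q = gmm.predict_proba(descriptors)                 # (N, K) soft assignments
    n = len(descriptors)
    mu, sigma = gmm.means_, np.sqrt(gmm.covariances_)  # diagonal covariances
    diff = (descriptors[:, None, :] - mu) / sigma      # (N, K, D) normalized residuals
    fv_mu = (q[:, :, None] * diff).sum(0)              # gradient w.r.t. means
    fv_sig = (q[:, :, None] * (diff ** 2 - 1)).sum(0)  # gradient w.r.t. std devs
    fv = np.hstack([fv_mu.ravel(), fv_sig.ravel()]) / n
    fv = np.sign(fv) * np.sqrt(np.abs(fv))             # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)           # L2 normalization

# Hypothetical data: each video is a list of per-frame local-descriptor matrices.
rng = np.random.default_rng(0)
videos = [[rng.normal(size=(50, 8)) for _ in range(20)] for _ in range(4)]
labels = [0, 0, 1, 1]                                  # e.g. two interaction classes

# Visual vocabulary: a GMM fitted on all local descriptors.
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
gmm.fit(np.vstack([f for v in videos for f in v]))

# One Fisher-vector sequence per video, one generative HMM per interaction class.
sequences = [np.vstack([fisher_vector(f, gmm) for f in v]) for v in videos]
models = {}
for c in set(labels):
    obs = [s for s, y in zip(sequences, labels) if y == c]
    m = GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
    m.fit(np.vstack(obs), lengths=[len(s) for s in obs])  # Baum-Welch training
    models[c] = m

# Classify a new sequence by maximum log-likelihood over the class HMMs.
test = sequences[0]
pred = max(models, key=lambda c: models[c].score(test))
print("predicted class:", pred)
```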

Keywords

Classification · Machine vision · Animal behavior · Artificial intelligence

Notes

Acknowledgements

The authors sincerely thank the editors and reviewers for their work.

Author Contributions

Y. Zheng conceived the study and designed the experiments. JL, WJ and Y. Zhao performed the experiments; MF, DW and SS analyzed the data. JL wrote the paper. All authors helped revise and approved the manuscript.

Funding

This work was made possible through support from the Natural Science Foundation of China (61572300), the Taishan Scholar Program of Shandong Province, China (TSHW201502038), and the SDUST Excellent Teaching Team Construction Plan (JXTD20160512).

Compliance with ethical standards

Conflict of interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of the data; in the writing of the manuscript; or in the decision to publish the results.

Ethical standard

This study was approved by the Animal Care and Use Committee of Qingdao Agricultural University (Qingdao, China).

Availability of data and material

The datasets analyzed during the current study are available from the corresponding authors upon reasonable request.


Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. Department of Electrical Engineering and Information Technology, Shandong University of Science and Technology, Jinan, China
  2. School of Information Science and Engineering, Key Lab of Intelligent Computing and Information Security at Universities of Shandong, Institute of Life Sciences, Shandong Provincial Key Laboratory for Distributed Computer Software Novel Technology, and Key Lab of Intelligent Information Processing, Shandong Normal University, Jinan, China
  3. School of Mechanical and Electrical Engineering, Qingdao Agricultural University, Qingdao, China
