
Human Activity Recognition for Domestic Robots

Chapter in Field and Service Robotics

Part of the book series: Springer Tracts in Advanced Robotics (STAR, volume 105)

Abstract

The capabilities of domestic service robots could be further improved if the robot were equipped with the ability to recognize activities performed by humans within its sensory range. For example, in a simple scenario, a floor-cleaning robot can vacuum the kitchen floor after recognizing the human activity "cooking in the kitchen". Most complex human activities can be subdivided into simple activities, which can later be used to recognize the complex ones. An activity like "taking medication" can be subdivided into simple activities such as "opening a pill container" and "drinking water". However, even recognizing simple activities is highly challenging because of similarities between different activities and dissimilarities within the same activity when it is performed by different people, in different body poses and orientations. Even a simple human activity like "drinking water" can be performed while the subject is sitting, standing, or walking. Building machine learning techniques that recognize human activities under such variability is therefore non-trivial. To address this issue, we propose a human activity recognition technique that uses 3D skeleton features produced by a depth camera. The algorithm assigns importance weights to the skeleton's 3D joints according to the activity being performed, allowing it to ignore confusing or irrelevant features while relying on informative ones. These weighted joint features are then ensembled to train Dynamic Bayesian Networks (DBNs), which are used to infer human activities from likelihoods. The proposed technique was tested on a publicly available dataset and in UTS experiments, achieving overall accuracies of 85% and 90%, respectively.
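The weighting-and-likelihood idea in the abstract can be illustrated with a minimal sketch. Note the simplifications: the chapter trains Dynamic Bayesian Networks, whereas this sketch stands in a per-activity diagonal Gaussian likelihood model; the joint count, importance weights, and activity labels below are all illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

def weighted_features(joints, weights):
    """Flatten 3D joint positions, scaling each joint by its importance weight
    so that informative joints dominate the feature vector."""
    return (joints * weights[:, None]).ravel()

def fit_activity_model(samples):
    """Fit a diagonal Gaussian (a simplified stand-in for a DBN) to a stack
    of weighted feature vectors for one activity."""
    mu = samples.mean(axis=0)
    var = samples.var(axis=0) + 1e-6  # floor the variance for stability
    return mu, var

def log_likelihood(x, model):
    mu, var = model
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def classify(x, models):
    """Infer the activity: pick the label whose model gives the highest
    log-likelihood for the observed weighted skeleton features."""
    return max(models, key=lambda label: log_likelihood(x, models[label]))

rng = np.random.default_rng(0)
n_joints = 15                 # assumption: a 15-joint depth-camera skeleton
weights = np.ones(n_joints)
weights[:4] = 3.0             # assumption: up-weight upper-body joints

def make_samples(center, n=50):
    """Generate synthetic skeleton frames clustered around `center`."""
    joints = center + 0.05 * rng.standard_normal((n, n_joints, 3))
    return np.stack([weighted_features(j, weights) for j in joints])

models = {
    "drinking water": fit_activity_model(make_samples(0.5)),
    "cooking": fit_activity_model(make_samples(-0.5)),
}

frame = 0.5 + 0.05 * rng.standard_normal((n_joints, 3))
print(classify(weighted_features(frame, weights), models))  # "drinking water"
```

Down-weighting a joint shrinks its contribution to the squared-error term of every model's likelihood, which is one simple way to make confusing joints effectively invisible to the classifier while informative joints drive the decision.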



Author information

Correspondence to Lasitha Piyathilaka.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Piyathilaka, L., Kodagoda, S. (2015). Human Activity Recognition for Domestic Robots. In: Mejias, L., Corke, P., Roberts, J. (eds) Field and Service Robotics. Springer Tracts in Advanced Robotics, vol 105. Springer, Cham. https://doi.org/10.1007/978-3-319-07488-7_27


  • DOI: https://doi.org/10.1007/978-3-319-07488-7_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07487-0

  • Online ISBN: 978-3-319-07488-7

  • eBook Packages: Engineering, Engineering (R0)
