Real-time Application for Monitoring Human Daily Activity and Risk Situations in Robot-Assisted Living

  • Mário Vieira
  • Diego R. Faria
  • Urbano Nunes
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 418)


In this work, we present a real-time application for human daily activity recognition in robot-assisted living, extending our previous work [1]. We implemented our approach in the Robot Operating System (ROS) environment, combining different modules so that a robot can perceive its surroundings through multiple sensor modalities. The robot can thus move around, and detect, track and follow a person to monitor daily activities wherever the person is. We focus mainly on the robotic application, integrating several ROS modules for navigation, activity recognition and decision making. Reported results show that our framework accurately recognizes human activities in real time, triggering proper robot (re)actions, including spoken feedback for warnings and/or appropriate robot navigation tasks. These results evidence the potential of our approach for robot-assisted living applications.
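As a hypothetical illustration (not the authors' code) of the decision-making step described above, the sketch below maps a recognized activity label and its classification confidence to a robot (re)action such as a spoken warning. The activity names, action labels, and confidence threshold are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of a decision-making policy for robot-assisted
# living: choose a robot (re)action from a recognized activity and its
# confidence. All labels and thresholds here are assumptions.

RISK_ACTIVITIES = {"falling", "fainting"}  # assumed risk situations

def decide_action(activity: str, confidence: float,
                  threshold: float = 0.7) -> str:
    """Return a robot action for a recognized activity."""
    if confidence < threshold:
        return "keep_monitoring"    # uncertain: keep following/observing
    if activity in RISK_ACTIVITIES:
        return "speak_warning"      # risk situation: trigger spoken feedback
    return "log_activity"           # ordinary daily activity: just record it

print(decide_action("falling", 0.9))   # risk situation -> speak_warning
print(decide_action("walking", 0.9))   # normal activity -> log_activity
print(decide_action("walking", 0.4))   # low confidence -> keep_monitoring
```

In a ROS implementation, such a policy would typically live in a node that subscribes to the activity-recognition output and publishes commands to the navigation and speech modules.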


Keywords: Mobile Robot · Activity Recognition · Risk Situation · Human Action Recognition · Robot Operating System




  1. Faria, D.R., Vieira, M., Premebida, C., Nunes, U.: Probabilistic human daily activity recognition towards robot-assisted living. In: IEEE RO-MAN 2015 (2015)
  2. Faria, D.R., Premebida, C., Nunes, U.: A probabilistic approach for human everyday activities recognition using body motion from RGB-D images. In: IEEE RO-MAN 2014, Kazuo Tanie Award Finalist (2014)
  3. Zhu, C., Sheng, W.: Realtime human daily activity recognition through fusion of motion and location data. In: IEEE International Conference on Information and Automation (2010)
  4. Zhu, C., Sheng, W.: Human daily activity recognition in robot-assisted living using multi-sensor fusion. In: IEEE ICRA 2009 (2009)
  5. Microsoft Kinect (accessed June 2015)
  6. Asus Xtion (accessed June 2015)
  7. Papadopoulos, G.T., Axenopoulos, A., Daras, P.: Real-time skeleton-tracking-based human action recognition using Kinect data. In: 3D and Augmented Reality, pp. 473–483. Springer International Publishing (2014)
  8. Chen, C., Liu, K., Kehtarnavaz, N.: Real-time human action recognition based on depth motion maps. Journal of Real-Time Image Processing (2013)
  9. Sung, J., Ponce, C., Selman, B., Saxena, A.: Unstructured human activity detection from RGBD images. In: ICRA 2012 (2012)
  10. Xia, L., Aggarwal, J.: Spatio-temporal depth cuboid similarity feature for activity recognition using depth camera. In: CVPR (2013)
  11. Zhu, Y., Chen, W., Guo, G.: Evaluating spatiotemporal interest point features for depth-based action recognition. Image and Vision Computing (2014)
  12. Mehdi, S.A., Armbrust, C., Koch, J., Berns, K.: Methodology for robot mapping and navigation in assisted living environments. In: 2nd International Conference on Pervasive Technologies Related to Assistive Environments (2009)
  13. Koppula, H.S., Gupta, R., Saxena, A.: Learning human activities and object affordances from RGB-D videos. IJRR (2012)
  14. Volkhardt, M., Müller, S., Schröter, C., Gross, H.-M.: Real-time activity recognition on a mobile companion robot. In: 55th International Scientific Colloquium (2010)
  15. Arsigny, V., Fillard, P., Pennec, X., Ayache, N.: Log-Euclidean metrics for fast and simple calculus on diffusion tensors. Magnetic Resonance in Medicine 56(2), 411–421 (2006)
  16. Guo, K.: Action recognition using log-covariance matrices of silhouette and optical-flow features. Ph.D. dissertation, Boston University, College of Engineering (2012)
  17. Chang, C.-C., Lin, C.-J.: LIBSVM: A library for support vector machines. ACM TIST (2011)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, Institute of Systems and Robotics, University of Coimbra, Polo II, Coimbra, Portugal
