Design of Face Tracking System Using Environmental Cameras and Flying Robot for Evaluation of Health Care

  • Veerachart Srisamosorn
  • Noriaki Kuwahara
  • Atsushi Yamashita
  • Taiki Ogata
  • Jun Ota
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9745)

Abstract

This paper presents a face tracking system for evaluating the health care services provided to elderly people in a care house. Because the face can show a patient's smile and emotional response, it can be used to evaluate the quality of the health care and treatment provided to each patient, and therefore to improve the quality of care. The conceptual system consists of cameras fixed in the environment, which provide information about each person's location and face direction, and moving cameras that track the faces. To prove the concept, a system with five fixed Kinects and a quadrotor was set up to cover the area and track one person. The experiment shows that the system can control the quadrotor to follow the person's movements. By attaching a wireless camera to the quadrotor, facial images can be obtained from the system, demonstrating the validity of the tracking.
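The abstract's division of labor (fixed cameras estimate head position and face direction; the quadrotor is then commanded to a viewpoint in front of the face) can be sketched as a small goal-pose computation. This is only an illustration under assumed conventions: the function name, world frame, and standoff distance are not taken from the paper.

```python
import math

def quadrotor_goal(head_xyz, face_yaw, standoff=1.0):
    """Hypothetical helper: place the camera-carrying quadrotor at a fixed
    standoff distance in front of the tracked face, looking back at it.

    head_xyz : (x, y, z) head position in an assumed world frame, metres.
    face_yaw : horizontal face direction in that frame, radians.
    Returns (goal_x, goal_y, goal_z, goal_yaw) for the quadrotor.
    """
    x, y, z = head_xyz
    # Step out along the face direction so the camera sees the face frontally.
    gx = x + standoff * math.cos(face_yaw)
    gy = y + standoff * math.sin(face_yaw)
    # Point the quadrotor's camera back toward the person.
    gyaw = face_yaw + math.pi
    return gx, gy, z, gyaw
```

In a real system the fixed Kinects would update `head_xyz` and `face_yaw` continuously, and a position controller would drive the quadrotor toward the returned goal.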

Keywords

Health care evaluation · Face tracking · UAV · Quadrotor

Notes

Acknowledgments

This work was partially supported by JSPS KAKENHI Grant Number 15H01698.


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Veerachart Srisamosorn (1)
  • Noriaki Kuwahara (2)
  • Atsushi Yamashita (1)
  • Taiki Ogata (3)
  • Jun Ota (3)
  1. Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo, Bunkyo-ku, Japan
  2. Department of Advanced Fibro-Science, Kyoto Institute of Technology, Kyoto-shi, Japan
  3. Research into Artifacts, Center for Engineering (RACE), The University of Tokyo, Kashiwa-shi, Japan