Collection and Classification of Gestures from People with Severe Motor Dysfunction for Developing Modular Gesture Interface

  • Ikushi Yoda
  • Kazuyuki Itoh
  • Tsuyoshi Nakayama
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9176)

Abstract

This study explores gesture-controlled user interfaces for people with severe motor function disabilities stemming from cerebral palsy, quadriplegia, and traumatic brain injury. Because of involuntary movement and spasticity, it is nearly impossible for these individuals to use conventional interface switches and other input devices to access and operate a computer. The ultimate objective of this work is to provide these users with user-friendly, cost-effective gesture-controlled interfaces that enable them to operate a personal computer comfortably. We have now succeeded in developing a non-contact, non-restraining interface based on an off-the-shelf image range sensor that recently became available. In addition, we surveyed a large number of disabled subjects, compiled a fairly exhaustive collection of gestures these users are capable of making, and classified these voluntary movements by the body part involved. Finally, a series of recognition modules has been developed, each optimized to recognize the gestures associated with a particular body part (hand, head, leg, etc.). This paper provides an overview of the gesture data collection and classification processes and discusses the development of the recognition modules.
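The modular design described in the abstract can be pictured as a set of interchangeable recognition modules behind one common interface, with the module selected to match whichever voluntary movement a given user can reliably produce. The sketch below is only an illustrative assumption, not the authors' implementation: the class names, the depth-frame format, and the threshold-based hand detector are hypothetical, shown to make the architecture concrete.

```python
# Minimal sketch of a modular gesture interface: one recognition module per
# body part, all sharing a common interface. Names, thresholds, and the
# depth-frame format are assumptions for illustration, not from the paper.
from abc import ABC, abstractmethod
import numpy as np


class GestureModule(ABC):
    """Common interface implemented by every body-part-specific recognizer."""

    @abstractmethod
    def detect(self, depth_frame: np.ndarray) -> bool:
        """Return True when the module recognizes its gesture in the frame."""


class HandRaiseModule(GestureModule):
    """Toy example: trigger when enough pixels enter a near-depth region."""

    def __init__(self, roi, near_mm=400, min_pixels=200):
        self.roi = roi                  # (top, bottom, left, right) in pixels
        self.near_mm = near_mm          # depths closer than this count as "hand"
        self.min_pixels = min_pixels    # how many close pixels trigger the gesture

    def detect(self, depth_frame):
        t, b, l, r = self.roi
        region = depth_frame[t:b, l:r]
        close = np.count_nonzero((region > 0) & (region < self.near_mm))
        return close >= self.min_pixels


class GestureInterface:
    """Dispatches depth frames to the module chosen for the current user."""

    def __init__(self, module: GestureModule, on_switch):
        self.module = module
        self.on_switch = on_switch      # callback emulating a switch press

    def process(self, depth_frame):
        if self.module.detect(depth_frame):
            self.on_switch()


# Usage with a synthetic 240x320 depth frame (millimetres); a real system
# would read frames from the image range sensor's SDK instead.
frame = np.full((240, 320), 1500, dtype=np.uint16)
frame[40:100, 100:200] = 300            # simulate a hand raised near the sensor

interface = GestureInterface(
    HandRaiseModule(roi=(0, 120, 80, 240)),
    on_switch=lambda: print("switch event"),
)
interface.process(frame)                # prints "switch event"
```

In this kind of design, swapping in a head- or leg-gesture module for a different user only requires implementing the same detect() interface; the rest of the system is unchanged.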

Keywords

Gesture interface · Gesture recognition · Alternative input device · Persons with motor dysfunction

Notes

Acknowledgment

Part of this work was supported by Health and Labor Sciences Research Grants: Comprehensive Research on Disability Health and Welfare in 2014. The authors gratefully acknowledge the many people who have supported and encouraged this work.


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan
  2. Research Institute, National Rehabilitation Center for Persons with Disabilities (NRCD), Tokorozawa, Japan