Service Robot Arm Controlled Just by Sight

  • Kohei Arai
Conference paper
Part of the Lecture Notes in Networks and Systems book series (LNNS, volume 69)

Abstract

A robot arm controlled by Eye Based Human-Computer Interaction (EBHCI) is proposed. As an example application, the proposed system allows a disabled person to select the desired food from a meal tray using their eyes only. The robot arm that retrieves the desired food is controlled by the user's gaze. A tiny camera is mounted at the tip of the robot arm, and the user wears glasses fitted with a single Head Mount Display (HMD) and another tiny camera, so that the user can view the food on the HMD and have it retrieved simply by looking at it. This is just one example; such a "magic arm" can provide plenty of other services. Experimental results show that a disabled person can retrieve the desired food successfully. It is also confirmed that robot arm control by EBHCI is much faster than control by hand.
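The selection step described above (look at a food item shown on the HMD until it is chosen) can be pictured as dwell-time gaze selection. The following is a minimal sketch of that idea, not the paper's actual implementation; the class names, bounding-box representation, and dwell threshold are all illustrative assumptions.

```python
# Hedged sketch: dwell-time selection of an on-screen item from per-frame
# gaze estimates. FoodItem, DwellSelector, and the frame threshold are
# illustrative assumptions, not the system described in the paper.
from dataclasses import dataclass

@dataclass
class FoodItem:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float  # screen-space bounding box of the item on the HMD

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

class DwellSelector:
    """Selects an item once gaze stays inside its box for dwell_frames frames."""

    def __init__(self, items, dwell_frames=30):  # ~1 s at a 30 fps gaze tracker
        self.items = items
        self.dwell_frames = dwell_frames
        self.current = None   # item the gaze is currently resting on
        self.count = 0        # consecutive frames spent on that item

    def update(self, gaze_x, gaze_y):
        """Feed one gaze estimate; returns the selected item or None."""
        hit = next((i for i in self.items if i.contains(gaze_x, gaze_y)), None)
        if hit is not None and hit is self.current:
            self.count += 1
            if self.count >= self.dwell_frames:
                return hit  # fixation long enough: trigger arm retrieval
        else:
            self.current = hit
            self.count = 1 if hit is not None else 0
        return None
```

In use, each per-frame gaze estimate from the camera would be passed to `update()`, and a non-None return would issue the retrieval command to the robot arm.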

Keywords

Robot arm control · Computer input just by sight

Acknowledgment

The author would like to thank Dr. Ronny Mardiyanto of ITS, Indonesia, for his great effort in conducting the experiments. The author also thanks Kenro Yajima, a first-half doctoral course student in the Department of Intelligence and Information Systems of the Graduate School of Engineering, as well as the other graduate students from Indonesia, the People's Republic of China, and Japan who served as subjects and cooperated with the experiments. The author expresses his deep appreciation.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Department of Information Science, Saga University, Saga City, Japan