AR Based User Interface for Driving Electric Wheelchairs

  • Shigeyuki Ishida
  • Munehiro Takimoto
  • Yasushi Kambayashi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10278)

Abstract

Today, the electric wheelchair is an essential tool for handicapped people. It may be difficult, however, for hand-impaired people to operate an electric wheelchair with the conventional joystick; for them, alternative input methods such as eye tracking must be provided. Traditional eye-tracking methods, however, are tiring, because the user has to keep his or her eyes fixed on the direction of the destination. In this paper, we propose a user interface that extends eye tracking with augmented reality (AR) technology. In our user interface, the view in front of the electric wheelchair is displayed on a PC screen, and an oval is overlaid at the user's gaze point in that view, as if a searchlight were focused on the specified location. The user can select the location under the oval as a temporary destination, to which the electric wheelchair then moves. By repeating this process, the user can intuitively drive his or her electric wheelchair to the final destination without burdensome operations.
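The interaction loop described above (overlay an oval at the gaze point, let the user confirm the spot under it as a temporary destination, move there, repeat) can be summarized in a short sketch. The following Python is only an illustration under assumed names: the paper publishes no API, and EyeTracker, Wheelchair, screen_to_ground, and dwell_confirmed are hypothetical stand-ins.

    import time

    class EyeTracker:
        """Hypothetical eye tracker: reports the user's gaze point on the screen."""

        def gaze_point(self):
            # A real tracker would return live screen coordinates; dummy value here.
            return (320, 240)

        def dwell_confirmed(self):
            # Stand-in for a dwell-time or blink gesture that confirms selection.
            return True

    class Wheelchair:
        """Hypothetical wheelchair controller."""

        def move_to(self, target):
            print(f"moving toward {target} ...")
            time.sleep(0.1)  # stand-in for actual motion

    def screen_to_ground(gaze_xy):
        """Map a screen gaze point to a ground target ahead of the chair.
        A real system would use the camera calibration; this just passes through."""
        return gaze_xy

    def drive(tracker, chair, steps=3):
        """Core loop: draw the oval at the gaze point, let the user confirm it as a
        temporary destination, move there, and repeat toward the final destination."""
        for _ in range(steps):
            gaze = tracker.gaze_point()    # where the oval is overlaid on screen
            if tracker.dwell_confirmed():  # user selects the spot under the oval
                chair.move_to(screen_to_ground(gaze))

    if __name__ == "__main__":
        drive(EyeTracker(), Wheelchair())

Selecting the destination incrementally in this way keeps each gaze fixation brief, which is the reduction in operational burden claimed over steering by continuous gaze.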

Keywords

Electric wheelchair · Eye tracking · Augmented reality

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Shigeyuki Ishida (1)
  • Munehiro Takimoto (1)
  • Yasushi Kambayashi (2)
  1. Department of Information Sciences, Tokyo University of Science, Noda-shi, Japan
  2. Department of Computer and Information Engineering, Nippon Institute of Technology, Saitama, Japan
