Invited Paper: Multimodal Interface for an Intelligent Wheelchair

  • Conference paper

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 325)

Abstract

As the age distribution of the population continues to shift, politicians and scientists are paying increasing attention to the needs of senior citizens. The well-being and needs of people with disabilities are likewise gaining importance in political and business circles. Intelligent wheelchairs are adapted electric wheelchairs with environmental perception, semi-autonomous behaviour and flexible human-machine interaction. This paper presents the specification and development of a user-friendly multimodal interface as a component of the IntellWheels Platform project. The developed prototype combines several input modules and allows the wheelchair to be controlled through flexible, user-defined input sequences of distinct types (speech, facial expressions, head movements and joystick). To validate the effectiveness of the prototype, two experiments were performed: participants first drove a simulated wheelchair in a virtual environment, and then drove the real IntellWheels wheelchair prototype. The results show that, thanks to the interaction flexibility it provides, the multimodal interface can be used successfully by its intended users.
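
To make the idea of user-defined multimodal input sequences more concrete, the following minimal Python sketch (illustrative only, not the IntellWheels implementation, which is not reproduced on this page) shows one way events from different modalities could be buffered and matched against user-defined sequences to trigger wheelchair commands. The InputEvent and SequenceMapper names, the "say 'go', then nod" sequence and the MOVE_FORWARD command are all hypothetical.

    # Illustrative sketch only: map user-defined multimodal input sequences
    # (speech, head movements, facial expressions, joystick) to commands.
    from __future__ import annotations

    from dataclasses import dataclass


    @dataclass(frozen=True)
    class InputEvent:
        modality: str  # e.g. "speech", "head", "face", "joystick" (hypothetical labels)
        token: str     # e.g. "go", "nod", "blink", "forward"


    class SequenceMapper:
        """Buffer incoming events and emit a command when the most recent
        events match one of the user-defined input sequences."""

        def __init__(self) -> None:
            self._sequences: dict[tuple[InputEvent, ...], str] = {}
            self._buffer: list[InputEvent] = []

        def define(self, sequence: list[InputEvent], command: str) -> None:
            # Register a user-defined sequence and the command it should trigger.
            self._sequences[tuple(sequence)] = command

        def feed(self, event: InputEvent) -> str | None:
            # Append the new event; return a command if the buffer tail matches.
            self._buffer.append(event)
            for seq, command in self._sequences.items():
                if tuple(self._buffer[-len(seq):]) == seq:
                    self._buffer.clear()
                    return command
            return None


    if __name__ == "__main__":
        mapper = SequenceMapper()
        # Hypothetical user-defined combination: say "go", then nod, to move forward.
        mapper.define([InputEvent("speech", "go"), InputEvent("head", "nod")],
                      "MOVE_FORWARD")
        print(mapper.feed(InputEvent("speech", "go")))  # None: sequence not yet complete
        print(mapper.feed(InputEvent("head", "nod")))   # prints MOVE_FORWARD

In a real system each modality (speech recogniser, head-movement tracker, facial-expression detector, joystick driver) would produce such events from its own input module; the sketch abstracts them as plain (modality, token) pairs.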

Acknowledgments

The authors would like to thank FCT, the Portuguese Science and Technology Foundation, for funding the INTELLWHEELS project (RIPD/ADA/109636/2009) and for the Ph.D. scholarship FCT/SFRH/BD/44541/2008, as well as LIACC (Laboratório de Inteligência Artificial e de Ciência de Computadores da Universidade do Porto), DETI/UA (Dep. de Electrónica, Telecomunicações e Informática), IEETA (Instituto de Engenharia Electrónica e Telemática de Aveiro) and ESTSP/IPP (Escola Superior de Tecnologia da Saúde do Porto, IPP).

Author information

Corresponding author

Correspondence to Luís Paulo Reis.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Reis, L.P., Faria, B.M., Vasconcelos, S., Lau, N. (2015). Invited Paper: Multimodal Interface for an Intelligent Wheelchair. In: Ferrier, JL., Gusikhin, O., Madani, K., Sasiadek, J. (eds) Informatics in Control, Automation and Robotics. Lecture Notes in Electrical Engineering, vol 325. Springer, Cham. https://doi.org/10.1007/978-3-319-10891-9_1

  • DOI: https://doi.org/10.1007/978-3-319-10891-9_1
  • Publisher Name: Springer, Cham
  • Print ISBN: 978-3-319-10890-2
  • Online ISBN: 978-3-319-10891-9
  • eBook Packages: Engineering, Engineering (R0)
