A Context-Aware Voice Operated Mobile Guidance System for Visually Impaired Persons

  • Kavi Kumar Khedo (Email author)
  • Kishan Yashveer Bhugul
  • David Young Ten
Chapter
Part of the EAI/Springer Innovations in Communication and Computing book series (EAISICC)

Abstract

The most important travelling aid for visually impaired persons is still the white cane, which is multifunctional, cheap and reliable. Studies of navigation by visually impaired persons have noted that even a small amount of extra information about the environment has a remarkably positive impact. In this chapter, a mobile application named Mobile Vision is described. The Mobile Vision application, developed at the University of Mauritius, is an innovative Android application that augments a visually impaired person's pedestrian experience with enough information to ease movement from one location to another. Innovative interaction mechanisms have been developed in Mobile Vision to allow visually impaired persons to use the mobile phone. Using maps pre-loaded on the mobile device, the application provides environmental conditions and landmark information along the route on the fly through simple explanatory voice cues. The application can advise users of their current location and provide spoken directions to travel to a particular destination.
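The chapter body details the actual implementation; purely as an illustrative sketch of the landmark-to-voice-cue idea the abstract describes, the Java snippet below matches a GPS fix against a small pre-loaded landmark list and builds a short spoken announcement. The Landmark class, the cueFor method, the range threshold and the sample coordinates are invented for illustration and are not taken from Mobile Vision; on Android the resulting string would typically be handed to the platform TextToSpeech engine.

```java
import java.util.Arrays;
import java.util.List;

/** Illustrative sketch only: build a voice cue from the nearest pre-loaded landmark. */
public class LandmarkAnnouncer {

    /** A named point from a pre-loaded map (hypothetical structure). */
    static final class Landmark {
        final String name;
        final double lat, lon;
        Landmark(String name, double lat, double lon) {
            this.name = name; this.lat = lat; this.lon = lon;
        }
    }

    /** Great-circle distance in metres between two coordinates (haversine formula). */
    static double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
        double r = 6371000.0;
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * r * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    /** Returns a short spoken cue for the nearest landmark within range, or null if none. */
    static String cueFor(double lat, double lon, List<Landmark> map, double rangeMetres) {
        Landmark best = null;
        double bestDist = Double.MAX_VALUE;
        for (Landmark l : map) {
            double d = distanceMetres(lat, lon, l.lat, l.lon);
            if (d < bestDist) { bestDist = d; best = l; }
        }
        if (best == null || bestDist > rangeMetres) return null;
        // On Android this string would then be passed to TextToSpeech.speak(...).
        return String.format("You are about %.0f metres from %s.", bestDist, best.name);
    }

    public static void main(String[] args) {
        // Hypothetical landmarks near Reduit, Mauritius, for demonstration only.
        List<Landmark> map = Arrays.asList(
            new Landmark("the main entrance", -20.2336, 57.4960),
            new Landmark("the library", -20.2329, 57.4955));
        System.out.println(cueFor(-20.2335, 57.4961, map, 50.0));
    }
}
```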

Keywords

Mobile application · Context awareness · Navigation system · Mobile user interfaces · Voice interactions


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Kavi Kumar Khedo (1) (Email author)
  • Kishan Yashveer Bhugul (1)
  • David Young Ten (1)
  1. Department of Digital Technologies, Faculty of Information, Communication and Digital Technologies, University of Mauritius, Reduit, Mauritius
