
HapAR: Handy Intelligent Multimodal Haptic and Audio-Based Mobile AR Navigation for the Visually Impaired

  • Chapter
  • First Online:
Technological Trends in Improved Mobility of the Visually Impaired

Part of the book series: EAI/Springer Innovations in Communication and Computing ((EAISICC))

Abstract

Visually impaired people often struggle to find the right direction toward their destination. This chapter presents HapAR, an innovative, low-cost mobile Augmented Reality solution that delivers haptic and audio cues to guide users around a campus. The walking direction is computed from the geo-location of the target building and the current position of the user. Initial testing conducted on campus gave promising results: participants found the system easy to use by simply pointing their mobile devices, felt a vibration whenever they strayed off track, and heard a voice assistant correcting their route.
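The abstract describes the core guidance idea: compare the bearing from the user's GPS position to the target building with the direction the device is pointing, and issue a haptic pulse plus a spoken correction when the two diverge. The minimal Kotlin sketch below illustrates that logic under stated assumptions; it is not the authors' implementation, and the GeoPoint and guidanceCue names, the sample coordinates, and the 20° tolerance are illustrative assumptions only.

```kotlin
import kotlin.math.*

// Hypothetical holder for a geo-located point (user position or campus building).
data class GeoPoint(val lat: Double, val lon: Double)

// Initial bearing in degrees (0..360) from `from` to `to`, using the standard
// forward-azimuth formula on a spherical Earth.
fun bearingTo(from: GeoPoint, to: GeoPoint): Double {
    val phi1 = Math.toRadians(from.lat)
    val phi2 = Math.toRadians(to.lat)
    val dLon = Math.toRadians(to.lon - from.lon)
    val y = sin(dLon) * cos(phi2)
    val x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
    return (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0
}

// Decide which cue to give: silence when roughly on track, otherwise a haptic
// pulse plus a spoken correction. The 20-degree tolerance is an assumption,
// not a value reported in the chapter.
fun guidanceCue(deviceHeading: Double, user: GeoPoint, target: GeoPoint): String {
    val bearing = bearingTo(user, target)
    var delta = bearing - deviceHeading
    if (delta > 180) delta -= 360
    if (delta < -180) delta += 360
    return when {
        abs(delta) <= 20.0 -> "on track: keep walking straight"
        delta > 0 -> "vibrate + say \"turn right ${delta.roundToInt()} degrees\""
        else -> "vibrate + say \"turn left ${(-delta).roundToInt()} degrees\""
    }
}

fun main() {
    val user = GeoPoint(21.4858, 39.1925)      // hypothetical current position
    val building = GeoPoint(21.4900, 39.1950)  // hypothetical campus building
    println(guidanceCue(deviceHeading = 10.0, user = user, target = building))
}
```

On a real device the heading would come from the compass/gyroscope sensors, the position from GPS, and the cue would be rendered through the phone's vibration motor and a text-to-speech voice rather than printed text.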



Acknowledgments

This work was supported by the Deanship of Scientific Research (DSR), King Abdulaziz University, Jeddah, Saudi Arabia. The authors therefore gratefully acknowledge DSR's technical and financial support.

Author information

Corresponding author

Correspondence to Ahmad Hoirul Basori.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Basori, A.H. (2020). HapAR: Handy Intelligent Multimodal Haptic and Audio-Based Mobile AR Navigation for the Visually Impaired. In: Paiva, S. (eds) Technological Trends in Improved Mobility of the Visually Impaired. EAI/Springer Innovations in Communication and Computing. Springer, Cham. https://doi.org/10.1007/978-3-030-16450-8_13


  • DOI: https://doi.org/10.1007/978-3-030-16450-8_13

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-16449-2

  • Online ISBN: 978-3-030-16450-8

  • eBook Packages: Engineering, Engineering (R0)
