Pervasive Assistive Technology for the Deaf-Blind: Need, Emergency and Assistance Through the Sense of Touch

  • Nicholas Caporusso
  • Michelantonio Trizio
  • Giovanni Perrone
Chapter
Part of the Human–Computer Interaction Series (HCIS) book series

Abstract

Deaf-blind people have some degree of combined impairment of both the visual and the auditory channels. Among sensory disabilities, deaf-blindness is one of the most severe; fortunately, it affects only a small percentage of the population. Being a niche market, in turn, is one of the main reasons why investing in innovation is not considered profitable by either companies or investors. As a result, there is little assistive technology specifically designed for deaf-blind people, even though deaf-blindness is a rare, challenging, demanding, and urgent condition. Although dealing with this type of disability can be complex, deaf-blind people have the very same needs as everyone else: independence, access to information, and social integration. The objective of this work is to review currently available technology suitable for supporting the deaf-blind in their daily life. Specifically, we focus on systems that support basic functional communication in cases of need, emergency, and assistance. We also discuss the main barriers to innovation in this niche market, and we introduce dbGLOVE, a low-cost solution that combines performance and acceptability.
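
As a rough illustration of the touch-based functional communication described above, the following Python sketch decodes touches on hypothetical glove sensor zones into letters and flags a request for assistance. The zone-to-letter mapping, the sensor identifiers, and the HELP trigger word are illustrative assumptions for this example, not the actual dbGLOVE design.

    # Minimal sketch of a touch-to-text decoder for a glove-based
    # communication device. The zone layout below is hypothetical;
    # the real dbGLOVE mapping and electronics are not described here.

    # Assumed assignment of palm/finger sensor zones to letters,
    # loosely inspired by tactile alphabets in which letters
    # correspond to locations on the hand.
    ZONE_TO_LETTER = {
        0: "A", 1: "B", 2: "C", 3: "D", 4: "E",
        5: "H", 6: "L", 7: "P", 8: "S", 9: "O",
    }

    EMERGENCY_WORD = "HELP"  # assumed trigger word for this sketch

    def decode_touches(zone_sequence):
        """Translate a sequence of touched sensor zones into text."""
        return "".join(ZONE_TO_LETTER.get(zone, "?") for zone in zone_sequence)

    def needs_assistance(message):
        """Flag messages that contain the emergency keyword."""
        return EMERGENCY_WORD in message

    # Simulated stream of touch events from the glove controller.
    touches = [5, 4, 6, 7]               # spells "HELP" in this toy mapping
    message = decode_touches(touches)
    print(message)                       # -> HELP
    print(needs_assistance(message))     # -> True

In a real device, the mapping would follow an established tactile alphabet (e.g., the Malossi alphabet), and the decoder would run on the device's controller, paired with vibrotactile feedback on the same zones for incoming messages.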

Keywords

Global Positioning System · Assistive Technology · Niche Market · Auditory Channel · Vibrotactile Feedback


Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  • Nicholas Caporusso 1, 2
  • Michelantonio Trizio 1, 2
  • Giovanni Perrone 1, 2

  1. QIRIS, Bari, Italy
  2. INTACT Healthcare, Bari, Italy
