A Modality Replacement Framework for the Communication between Blind and Hearing Impaired People

  • Konstantinos Moustakas
  • Dimitrios Tzovaras
  • Laila Dybkjær
  • Niels Ole Bernsen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5616)

Abstract

This paper presents a multimodal framework for the communication between blind and hearing impaired people. The algorithms that are developed are based on the concept of modality replacement, i.e., the use of information originating from various modalities to compensate for the missing input modality of the system or the users. Spatial information is conveyed in the blind user’s terminal through the haptic modality using an efficient haptic rendering scheme, while verbal information is presented in the hearing impaired user’s terminal through sign language synthesis. All technologies are integrated in a virtual treasure hunting game, in which a blind and a hearing impaired user have to collaborate in order to navigate in the virtual environment and solve the riddle of the game. Usability evaluation has shown that the proposed system has a significant impact for the disabled users.

Keywords

Modality replacement, multimodal interfaces, haptics, sign language, virtual reality

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Konstantinos Moustakas (1)
  • Dimitrios Tzovaras (1)
  • Laila Dybkjær (2)
  • Niels Ole Bernsen (2)
  1. Informatics and Telematics Institute, Centre for Research and Technology Hellas, Thermi-Thessaloniki, Greece
  2. NISLab, Copenhagen, Denmark
