Abstract
This paper presents a multimodal framework for communication between blind and hearing-impaired people. The algorithms developed are based on the concept of modality replacement: the use of information originating from various modalities to compensate for the missing input modality of the system or the users. Spatial information is conveyed in the blind user’s terminal through the haptic modality, using an efficient haptic rendering scheme, while verbal information is presented in the hearing-impaired user’s terminal through sign language synthesis. All technologies are integrated in a virtual treasure hunting game, in which a blind user and a hearing-impaired user must collaborate to navigate the virtual environment and solve the riddle of the game. Usability evaluation of this framework has shown that the proposed system has a significant positive impact for disabled users.
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Moustakas, K., Tzovaras, D., Dybkjær, L., Bernsen, N.O. (2009). A Modality Replacement Framework for the Communication between Blind and Hearing Impaired People. In: Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. Applications and Services. UAHCI 2009. Lecture Notes in Computer Science, vol 5616. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02713-0_24
Print ISBN: 978-3-642-02712-3
Online ISBN: 978-3-642-02713-0