Improvements in a Wearable Device for Sign Language Translation
A commercial product for sign language translation is still not available. This paper presents our latest results towards this goal through a functional prototype called Talking Hands. Talking Hands uses a data-glove to detect the hand movements of the user and a smartphone application that gathers the sensor data and translates it into voice through a speech synthesizer. Talking Hands adopts solutions suitable for mass production without penalizing reliability. This paper presents the improvements of the latest prototype in terms of hardware, software and design, together with a preliminary analysis of the translation of dynamic gestures with this device.
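The pipeline described above (glove readings classified on the phone, then spoken by a TTS engine) can be illustrated with a minimal sketch. All names, sensor templates, and the distance threshold below are illustrative assumptions, not the actual Talking Hands implementation, which the paper describes at the hardware and design level only.

```python
# Minimal sketch of a data-glove -> text -> speech pipeline.
# Assumption: each glove sample is five normalized flex-sensor values
# (0 = finger open, 1 = fully bent); gestures are matched to the
# nearest stored template, then handed to a speech synthesizer.
import math

# Hypothetical calibrated templates, one per static gesture.
GESTURE_TEMPLATES = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1],
    "thank you": [0.9, 0.2, 0.2, 0.2, 0.8],
    "yes":       [0.9, 0.9, 0.9, 0.9, 0.2],
}

def classify(reading, max_distance=0.5):
    """Nearest-template match; returns None when nothing is close enough."""
    best_word, best_dist = None, float("inf")
    for word, template in GESTURE_TEMPLATES.items():
        dist = math.dist(reading, template)  # Euclidean distance
        if dist < best_dist:
            best_word, best_dist = word, dist
    return best_word if best_dist <= max_distance else None

def speak(word):
    # Placeholder for the smartphone's speech-synthesizer call.
    return f"[TTS] {word}"

reading = [0.88, 0.91, 0.87, 0.93, 0.15]  # simulated glove sample
word = classify(reading)
if word is not None:
    print(speak(word))  # -> [TTS] yes
```

Note that this template-matching sketch only covers static postures; the dynamic gestures analyzed in the paper additionally require the time evolution of the readings (e.g. a sequence classifier) rather than a single sample.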
Keywords: Sign Language Recognition · Deaf · Data-glove · Gesture recognition
This work is supported by Limix S.r.l. (www.limix.it). Limix is an Italian start-up and spin-off of the University of Camerino. The intellectual property of Talking Hands and its parts (hardware, software, design) belongs to Limix S.r.l.
- 1. Perkins, R., Battle, T., Edgerton, J., Mcneill, J.: A survey of barriers to employment for individuals who are deaf. J. Am. Deaf. Rehabil. Assoc. 49(1), 66–85 (2015)
- 2. Kim, H., Lee, S., Lee, D., Choi, S., Ju, J., Myung, H.: Real-time human pose estimation and gesture recognition from depth images using superpixels and SVM classifier. Sensors 15(6), 12410–12427 (2015)
- 4. Cooper, H., Pugeault, N., Bowden, R.: Reading the signs: a video based sign dictionary. In: IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona (2011)
- 9. Bajpai, D., Porov, U., Srivastav, G., Sachan, N.: Two way wireless data communication and American sign language translator glove for images, text and speech display on mobile phone. In: 2015 Fifth International Conference on Communication Systems and Network Technologies (CSNT 2015), pp. 578–585 (2015)
- 12. Seymour, M., Tsoeu, M.: A mobile application for South African Sign Language (SASL) recognition, pp. 1–5 (2015)
- 13. Kau, L.J., Su, W.L., Yu, P.J., Wei, S.J.: A real-time portable sign language translation system. In: 2015 IEEE 58th International Midwest Symposium on Circuits and Systems (MWSCAS), pp. 1–4 (2015)
- 14. Devi, S., Deb, S.: Low cost tangible glove for translating sign gestures to speech and text in Hindi language. In: 3rd International Conference on Computational Intelligence & Communication Technology (CICT), pp. 1–5 (2017)
- 15. Pezzuoli, F., Corona, D., Corradini, M.L., Cristofaro, A.: Development of a wearable device for sign language translation. In: International Workshop on Human-Friendly Robotics (HFR 2017), pp. 115–126 (2017)
- 17. Murakami, K., Taguchi, H.: Gesture recognition using recurrent neural networks. In: ACM Conference on Human Factors in Computing Systems: Reaching Through Technology (CHI 1991) (1991)
- 18. Vogler, C.: American Sign Language recognition: reducing the complexity of the task with phoneme-based modeling and parallel Hidden Markov Models. Ph.D. thesis, University of Pennsylvania (2003)
- 19. Li, X.: Gesture recognition based on fuzzy C-Means clustering algorithm. Department of Computer Science, The University of Tennessee, Knoxville
- 20. Nagi, J., et al.: Max-pooling convolutional neural networks for vision-based hand gesture recognition. In: 2011 IEEE International Conference on Signal and Image Processing Applications (ICSIPA), pp. 342–347 (2011)