Development of a Wearable Device for Sign Language Translation

  • Conference paper

Part of the book series: Springer Proceedings in Advanced Robotics ((SPAR,volume 7))

Abstract

A wearable device for sign language translation, called Talking Hands, is presented. It consists of a custom data glove, designed to optimize data acquisition, and a smartphone application that offers user personalization. Although Talking Hands cannot translate an entire sign language, it enables effective communication between deaf and mute people and everyone else through a scenario-based translation. The different challenges of a gesture recognition system have been overcome with simple solutions, since the main goal of this work is a user-oriented product.
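As a rough illustration of the scenario-based approach mentioned in the abstract, the sketch below shows how a recognized gesture might be mapped to different spoken phrases depending on the scenario selected in the companion app. The paper does not publish Talking Hands' implementation; every function, label, and phrase here is a hypothetical placeholder.

```python
# Hypothetical sketch of scenario-based translation (not the Talking Hands code).
# All scenario names, gesture labels, and phrases are invented for illustration.

from typing import Dict, Optional

# Each scenario maps recognized gesture labels to the phrase to be spoken.
SCENARIOS: Dict[str, Dict[str, str]] = {
    "restaurant": {
        "gesture_01": "A table for two, please.",
        "gesture_02": "Could I see the menu?",
    },
    "pharmacy": {
        "gesture_01": "I need this prescription filled.",
        "gesture_02": "Do you have something for a headache?",
    },
}


def translate(gesture_label: str, scenario: str) -> Optional[str]:
    """Return the phrase associated with a recognized gesture in the active scenario."""
    return SCENARIOS.get(scenario, {}).get(gesture_label)


if __name__ == "__main__":
    # The same gesture yields different phrases in different scenarios,
    # which keeps the per-scenario vocabulary small and reliable.
    print(translate("gesture_01", "restaurant"))  # A table for two, please.
    print(translate("gesture_01", "pharmacy"))    # I need this prescription filled.
```

Restricting translation to a per-scenario vocabulary in this way keeps the recognition problem small, which is consistent with the abstract's stated preference for simple solutions over full sign-language translation.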



Acknowledgements

This work is supported by Limix S.r.l. (www.limix.it), an Italian start-up and spin-off of the University of Camerino. The intellectual property of Talking Hands and its components (hardware, software, design) belongs to Limix S.r.l.

Author information


Corresponding author

Correspondence to Dario Corona.


Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Pezzuoli, F., Corona, D., Corradini, M.L., Cristofaro, A. (2019). Development of a Wearable Device for Sign Language Translation. In: Ficuciello, F., Ruggiero, F., Finzi, A. (eds) Human Friendly Robotics. Springer Proceedings in Advanced Robotics, vol 7. Springer, Cham. https://doi.org/10.1007/978-3-319-89327-3_9
