A depth-based Indian Sign Language recognition using Microsoft Kinect

Abstract

Automatic sign language recognition has become important for bridging the communication gap between the hearing and the Hearing and Speech Impaired. This paper introduces an efficient algorithm for translating input hand gestures in Indian Sign Language (ISL) into meaningful English text and speech. The system captures hand gestures through a Microsoft Kinect, preferred because its performance is unaffected by surrounding light conditions and object colour. The dataset consists of depth and RGB images (captured with a Kinect Xbox 360) covering 140 unique ISL gestures performed by 21 subjects, including single-handed signs, double-handed signs and fingerspelling (signs for alphabets and numbers), for a total of 4600 images. To recognize a hand posture, the hand region is first segmented accurately, and hand features are then extracted using Speeded Up Robust Features (SURF), Histogram of Oriented Gradients (HOG) and Local Binary Patterns (LBP). The system ensembles three feature-wise classifiers, each trained using a Support Vector Machine (SVM), improving the average recognition accuracy to 71.85%. The recognized sequence of hand gestures is then translated into the best approximate meaningful English sentence. The system achieved 100% accuracy for the signs representing 9, A, F, G, H, N and P.
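The pipeline described above, per-feature classifiers combined by voting, can be sketched as follows. This is an illustrative minimal sketch, not the authors' implementation: it uses scikit-image and scikit-learn on synthetic stand-ins for segmented depth hand crops, keeps only the HOG and LBP branches (SURF needs OpenCV's non-free contrib module), and uses two classes instead of 140.

```python
import numpy as np
from skimage.feature import hog, local_binary_pattern
from sklearn.svm import SVC

# Synthetic stand-ins for segmented 64x64 depth hand crops (2 classes, illustrative only)
rng = np.random.default_rng(0)
X_img = rng.random((20, 64, 64))
y = np.repeat([0, 1], 10)

def hog_feat(img):
    # Histogram of Oriented Gradients descriptor
    return hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def lbp_feat(img):
    # Local Binary Pattern histogram (uniform patterns: radius 1, 8 neighbours -> 10 bins)
    lbp = local_binary_pattern(img, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return hist

X_hog = np.array([hog_feat(i) for i in X_img])
X_lbp = np.array([lbp_feat(i) for i in X_img])

# One multi-class SVM per feature type; ensemble by majority vote at prediction time
clf_hog = SVC(kernel="rbf").fit(X_hog, y)
clf_lbp = SVC(kernel="rbf").fit(X_lbp, y)

def predict(img):
    votes = [clf_hog.predict([hog_feat(img)])[0],
             clf_lbp.predict([lbp_feat(img)])[0]]
    return np.bincount(votes).argmax()

pred = predict(X_img[0])
```

With the third (SURF-based) classifier added, the majority vote over three predictors resolves most ties, which is one motivation for ensembling an odd number of feature classifiers.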




Author information

Corresponding author

Correspondence to T Raghuveera.

Cite this article

Raghuveera, T., Deepthi, R., Mangalashri, R. et al. A depth-based Indian Sign Language recognition using Microsoft Kinect. Sādhanā 45, 34 (2020). https://doi.org/10.1007/s12046-019-1250-6

Keywords

  • ISL gesture recognition
  • multi-class SVM
  • SURF
  • HOG
  • LBP
  • depth-based
  • gesture translation