Smart Universities: Gesture Recognition Systems for College Students with Disabilities

  • Jeffrey P. Bakken
  • Nivee Varidireddy
  • Vladimir L. Uskov
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 188)


In a highly technological society, smart universities, smart classrooms, and smart education are the wave of the future. Among their many distinctive features is the ability to adapt to and smoothly accommodate various types of learners, both in on-campus classrooms and through remote/online access. Such environments can benefit all students, including students with various types of disabilities such as physical, visual, hearing, speech, cognitive, and other impairments. This paper presents the outcomes of an ongoing research project aimed at the systematic identification, analysis, testing, and rating of available open-source and commercial gesture recognition systems that could significantly benefit college students with disabilities in highly technological environments. Based on a careful analysis of these open-source and commercially available products, we identify and recommend the top gesture recognition systems for implementation in smart universities.
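The paper surveys existing gesture recognition systems rather than presenting a new algorithm. For readers unfamiliar with how such systems work internally, the core idea behind many lightweight, template-based recognizers (the "$1 unistroke recognizer" family and similar) can be sketched as follows: resample a recorded stroke to a fixed number of points, normalize for position and scale, and classify by nearest stored template. The gesture names and templates below are purely illustrative and are not taken from the paper or from any system it evaluates.

```python
import math

def resample(points, n=16):
    """Resample a 2-D stroke to n roughly evenly spaced points."""
    pts = list(points)
    total = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
    if total == 0:
        return [pts[0]] * n
    interval = total / (n - 1)
    out = [pts[0]]
    acc = 0.0
    i = 0
    while len(out) < n and i < len(pts) - 1:
        d = math.dist(pts[i], pts[i + 1])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                 pts[i][1] + t * (pts[i + 1][1] - pts[i][1]))
            out.append(q)
            pts.insert(i + 1, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:          # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def normalize(points):
    """Translate a stroke to its centroid and scale its larger extent to 1."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    pts = [(x - cx, y - cy) for x, y in points]
    w = max(x for x, _ in pts) - min(x for x, _ in pts)
    h = max(y for _, y in pts) - min(y for _, y in pts)
    s = max(w, h) or 1.0
    return [(x / s, y / s) for x, y in pts]

def classify(stroke, templates, n=16):
    """Return the name of the template closest to the stroke (mean distance)."""
    probe = normalize(resample(stroke, n))
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        ref = normalize(resample(tmpl, n))
        d = sum(math.dist(p, q) for p, q in zip(probe, ref)) / n
        if d < best_d:
            best, best_d = name, d
    return best

# Two hypothetical gesture templates recorded as (x, y) point sequences
templates = {
    "swipe_right": [(0, 0), (10, 0)],
    "swipe_down":  [(0, 0), (0, 10)],
}
print(classify([(1, 1), (9, 1.5)], templates))  # a noisy horizontal stroke
```

Production systems of the kind the paper evaluates layer much more on top of this (hand tracking from camera, radio, or ultrasonic input; rotation invariance; rejection thresholds), but the template-matching core is a common building block.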


Keywords: Gesture recognition · Students with disabilities · Software systems · Smart university



Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Jeffrey P. Bakken (1), corresponding author
  • Nivee Varidireddy (2)
  • Vladimir L. Uskov (2)

  1. The Graduate School, Bradley University, Peoria, USA
  2. Department of Computer Science and Information Systems and InterLabs Research Institute, Bradley University, Peoria, USA
