Hand Gesture-Based Stable PowerPoint Presentation Using Kinect

  • Praveen Kumar (Email author)
  • Anurag Jaiswal
  • B. Deepak
  • G. Ram Mohana Reddy
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 518)


Recent trends in the development of interactive devices provide a better human–computer interaction (HCI) experience in various domains, e.g., academics, the corporate world, teaching-assistant tools, and gaming. Sensor- and camera-based applications are currently an active area of research. In this paper, we focus on developing a natural hand gesture-based tool for controlling PowerPoint presentations with greater efficiency and stability. In order to provide stability during a presentation using the Microsoft Kinect, this paper proposes a novel locking and unlocking mechanism for gesture recognition. A comparative study has been carried out on three parameters, namely viewer's interest, frequency of movements, and overall stability during the presentation. Results show that the gesture-based presentation system provides a better experience to both presenter and viewer than the traditional system. Further, no extra device such as a mouse or keyboard is needed while presenting. The tool can also serve as a basis for other related applications.


Keywords: Human–computer interaction (HCI) · Hand gesture recognition · Kinect depth sensor
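The locking and unlocking idea described in the abstract can be pictured as a small state machine: the system starts locked so that the presenter's incidental hand movements are ignored, a deliberate pose held for a fixed time toggles the lock, and only in the unlocked state are swipe gestures translated into slide commands. The sketch below illustrates this under stated assumptions: joint positions are plain (x, y) tuples as a Kinect skeleton stream would supply per frame, and the pose, thresholds, and hold time are illustrative values, not the authors' published parameters.

```python
# Minimal sketch of a lock/unlock gesture state machine, assuming Kinect
# skeleton joints arrive per frame as (x, y) tuples in metres. Thresholds
# and the toggle pose are hypothetical, chosen only for illustration.

class GestureController:
    HOLD_FRAMES = 30          # ~1 s at 30 fps to toggle the lock

    def __init__(self):
        self.locked = True    # start locked: stray movements are ignored
        self._hold = 0        # consecutive frames the toggle pose is held

    def update(self, right_hand, head, prev_right_hand):
        """Process one frame; return a slide command or None."""
        # Toggle pose: right hand raised well above the head.
        if right_hand[1] > head[1] + 0.20:
            self._hold += 1
            if self._hold == self.HOLD_FRAMES:
                self.locked = not self.locked
            return None
        self._hold = 0

        if self.locked:
            return None       # all other gestures ignored while locked

        # Simple horizontal swipe between consecutive frames.
        dx = right_hand[0] - prev_right_hand[0]
        if dx > 0.15:
            return "next_slide"
        if dx < -0.15:
            return "prev_slide"
        return None
```

In a real deployment the returned commands would be forwarded to PowerPoint (e.g., as simulated key presses); the state machine itself is what provides the stability the paper aims for, since nothing reaches the presentation software while the controller is locked.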



Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  • Praveen Kumar (1) (Email author)
  • Anurag Jaiswal (1)
  • B. Deepak (1)
  • G. Ram Mohana Reddy (1)
  1. National Institute of Technology Karnataka, Surathkal, Mangalore, India
