Multimedia Tools and Applications, Volume 75, Issue 16, pp 9685–9706

Implementation of an interactive TV interface via gesture and handwritten numeral recognition



In this study, a Kinect controller was used to develop control software for interactive television (ITV) and interactive multimedia, enabling users to play videos and perform interactive operations intuitively and conveniently. Because no hand-held button controller is required, the proposed design supports natural human–machine interaction. The interactive control system comprises two parts: dynamic gesture recognition and handwriting recognition. The Kinect sensor serves as the input device, recognizing users' dynamic gestures for real-time interactive control; TV channels can also be selected automatically by recognizing digits handwritten in the air. A back-propagation neural network (BPNN) performs the in-air handwriting recognition to achieve the optimal recognition rate.
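The back-propagation training described above can be sketched as follows. This is a minimal illustration only: the data here is synthetic (the paper's actual inputs are digit trajectories captured by the Kinect sensor and reduced by feature extraction), and the network size, learning rate, and class prototypes are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted digit features: 10 classes (digits 0-9),
# each clustering around a random 16-dimensional prototype. Hypothetical data;
# the paper derives its features from in-air handwriting trajectories.
protos = rng.normal(size=(10, 16))
y = rng.integers(0, 10, size=500)
X = protos[y] + 0.1 * rng.normal(size=(500, 16))

# One-hot targets for the 10 digit classes.
T = np.zeros((y.size, 10))
T[np.arange(y.size), y] = 1.0

# One hidden layer with sigmoid activations, trained by plain back-propagation
# on a squared-error loss (layer widths and learning rate are assumptions).
W1 = rng.normal(scale=0.5, size=(16, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 10)); b2 = np.zeros(10)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(2000):
    h = sigmoid(X @ W1 + b1)               # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)             # forward pass: output layer
    d_out = (out - T) * out * (1 - out)    # output delta (squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)     # delta propagated back to hidden
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

# Evaluate on the training data with the final weights.
h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
acc = (out.argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In a real deployment the predicted class index would be mapped directly to a TV channel number, closing the loop from in-air handwriting to channel selection.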


Keywords: Back-propagation neural network (BPNN) · Feature extraction · Gesture recognition · Handwriting recognition · Interactive television (TV) · Principal curves


Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this article.



Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. Department of Computer Science, National Taipei University of Education, Taipei, Taiwan
