Abstract
This paper introduces MESIG (Musical Education Sound Interactive Game), an emotional sound edutainment system that teaches children basic musical composition through a new type of user interface. The interactive game interface lets children enjoy the game while learning to compose musical notes by touching tangible objects instead of using ordinary input devices. This way of experiencing and playing computer games has evolved to use body and hand movements to interact with the game in a virtual environment, which engages children and improves their learning more effectively. The system requires a single camera and performs skin-color-model tracking to detect hand gestures, which serve as the input device for playing the game. This computer-vision technique, based on image processing, makes it possible to operate an expressive interactive musical education system. To assess its effectiveness, evaluation and analysis were carried out on the implemented sound edutainment game.
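The abstract mentions skin-color-model tracking of the hand from a single camera as the game's input device. As a minimal illustrative sketch (not the authors' implementation), a rule-based RGB skin classifier plus a centroid of the resulting mask gives a crude hand-position estimate; the thresholds below are the well-known explicit RGB skin rule and all function names are hypothetical:

```python
def is_skin_rgb(r, g, b):
    """Explicit RGB skin rule: True for skin-like pixels under normal lighting."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_mask(frame):
    """frame: rows of (r, g, b) tuples -> binary mask (1 = skin pixel)."""
    return [[1 if is_skin_rgb(*px) else 0 for px in row] for row in frame]

def hand_centroid(mask):
    """Centroid (x, y) of skin pixels -- a crude single-hand position estimate."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# Tiny 2x2 "frame": left column skin-toned, right column dark background.
frame = [[(220, 160, 120), (30, 30, 30)],
         [(220, 160, 120), (30, 30, 30)]]
mask = skin_mask(frame)        # [[1, 0], [1, 0]]
pos = hand_centroid(mask)      # (0.0, 0.5)
```

A real system would apply this per video frame (typically after converting to a chroma-based color space for lighting robustness) and map the tracked position onto the game's tangible note targets.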
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Park, M., Kim, K.: Design of Interactive Emotional Sound Edutainment System. In: Jacko, J.A. (ed.) Human-Computer Interaction. Interacting in Various Application Domains. HCI 2009. Lecture Notes in Computer Science, vol. 5613. Springer, Berlin, Heidelberg (2009). https://doi.org/10.1007/978-3-642-02583-9_41
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-02582-2
Online ISBN: 978-3-642-02583-9