An Innovative Real-Time Mobile Augmented Reality Application in Arts

  • Chutisant Kerdvibulvech
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10325)


Because music, motion tracking, and augmented reality have all grown in popularity in recent years, research at their intersection has attracted considerable attention. In contrast to previous work, this paper presents an innovative real-time mobile application in the arts that helps musicians by integrating motion tracking with augmented reality technology. A kinematic filtering algorithm is used to compute the tracking parameters, and each of the musician's hands is then tracked with the Microsoft Kinect. An augmented reality application with an integrated multimedia feature is then built with PixLive Maker and synchronized with the music being played. This hybrid application allows musicians to interact with a virtual piano in a way similar to playing a real one: by pressing a selected piano key in the air, the sound of each note is generated in sequence into a song and presented through a smartphone interface. The application achieves a computation rate suitable for real-time use. Representative experimental results show that the application benefits piano players by allowing them to practice and touch a virtual piano at lower cost and with an interactive experience.
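The paper does not specify the exact kinematic filter or key-mapping scheme, and the Kinect and PixLive Maker APIs are proprietary, so the following is only a minimal sketch of the pipeline the abstract describes: a constant-velocity smoothing filter (a hypothetical stand-in for the paper's kinematic filtering step) applied to a tracked hand coordinate, followed by mapping the filtered horizontal position to one of 88 virtual piano keys and its equal-tempered frequency. All names and parameters here are illustrative assumptions, not the authors' implementation.

```python
class KinematicFilter:
    """Constant-velocity smoothing filter: blends each new measurement
    with a position predicted from the previous estimate and velocity."""

    def __init__(self, gain=0.5):
        self.gain = gain   # how strongly measurements correct the prediction
        self.pos = None    # last filtered position
        self.vel = 0.0     # estimated velocity per frame

    def update(self, measurement):
        if self.pos is None:           # first frame: no history to predict from
            self.pos = measurement
            return self.pos
        predicted = self.pos + self.vel
        new_pos = predicted + self.gain * (measurement - predicted)
        self.vel = new_pos - self.pos  # update velocity from the position change
        self.pos = new_pos
        return self.pos


def key_for_x(x, width=1.0, n_keys=88):
    """Map a normalized horizontal hand position in [0, width) to a key index."""
    idx = int(x / width * n_keys)
    return max(0, min(n_keys - 1, idx))


def key_frequency(idx):
    """Equal-tempered frequency of key idx on an 88-key piano (A4 = index 48)."""
    return 440.0 * 2 ** ((idx - 48) / 12)
```

In a real system the measurement would come from the Kinect skeleton stream each frame, and the key index would trigger note playback in the mobile interface; the filter here merely illustrates how jitter in the raw hand position can be smoothed before key selection.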


Motion tracking · Augmented reality · Kinematic filtering · Microsoft Kinect · Multimedia · Musical instrument · Arts · Virtual piano · Real-time



The research presented herein was partially supported by a research grant from the Research Center, NIDA (National Institute of Development Administration).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Graduate School of Communication Arts and Management Innovation, National Institute of Development Administration, Bangkok, Thailand
