VRAS: A Virtual Rehearsal Assistant System for Live Performance

  • Yufeng Wu
  • Gangyi Ding (corresponding author)
  • Hongsong Li
  • Tong Xue
  • Di Jiao
  • Tianyu Huang
  • Longfei Zhang
  • Fuquan Zhang
  • Lin Xu
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 128)


VR technologies are widely adopted for training purposes, providing users with educational virtual experiences. In this work, we propose an immersive VR system that helps choreographers and dancers facilitate their dance rehearsal. The system integrates motion capture devices and head-mounted displays (HMDs). The motions of the dancers, their partners, and the choreographers are captured and projected into a virtual dancing scene at an interactive frame rate. The dancers wearing the HMDs can observe the synthesized virtual performance within a virtual stage space from several selected third-person views: the audience’s view, the dancing partner’s view, and the choreographer’s view. Such synthesized external self-images augment the dancers’ perception of their performance and their understanding of the choreography. Feedback from the participants indicates that the proposed system is effective, and the preliminary experimental results agree with our observations.
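The abstract describes letting a dancer’s HMD render the shared virtual stage from one of three tracked third-person viewpoints. The core of that switching logic can be sketched as follows. This is an illustrative sketch only; the paper does not publish its implementation, and all names here (`View`, `Pose`, `camera_pose`) are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum


class View(Enum):
    """The three third-person views named in the abstract."""
    AUDIENCE = "audience"
    PARTNER = "partner"
    CHOREOGRAPHER = "choreographer"


@dataclass
class Pose:
    """A tracked position in stage coordinates (assumed metres)."""
    x: float
    y: float
    z: float


def camera_pose(view: View, partner: Pose, choreographer: Pose,
                audience_seat: Pose) -> Pose:
    """Pick the pose the dancer's HMD should render the scene from."""
    if view is View.AUDIENCE:
        return audience_seat      # a fixed front-of-stage vantage point
    if view is View.PARTNER:
        return partner            # the partner's captured head pose
    return choreographer          # the choreographer's captured head pose
```

In a real pipeline the returned pose would be fed to the renderer each frame, so that the dancer’s external self-image updates at the interactive frame rate the motion-capture stream provides.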


Dance rehearsal · Choreography · Immersive virtual reality · Motion capture



The research leading to these results received funding from the National Natural Science Foundation of China under Grant No. 60975013 and Grant No. 61202243 (Research on Real-time Motion Capture with Minimal Inertial Sensors), and was supported by the Major Project of Sichuan Province Key Laboratory of Digital Media Art under Grant No. 17DMAKL01 and by the Fujian Province Guiding Project under Grant No. 2018H0028.


  1. Azarbayejani, A., Wren, C., Pentland, A.: Real-time 3-D tracking of the human body. In: Proceedings of IMAGE’COM 1996, pp. 780–785 (1996)
  2. Alexiadis, D.S., Kelly, P., Daras, P., et al.: Evaluating a dancer’s performance using Kinect-based skeleton tracking, pp. 659–662 (2011)
  3. Anderson, F., Grossman, T., Matejka, J., Fitzmaurice, G.: YouMove: enhancing movement training with an augmented reality mirror. In: Proceedings of UIST 2013, pp. 311–320. ACM, New York (2013)
  4. Chan, J.C.P., Leung, H., Tang, J.K.T., et al.: A virtual reality dance training system using motion capture technology. IEEE Trans. Learn. Technol. 4(2), 187–195 (2011)
  5. Chua, P.T., Crivella, R., Bo, D., et al.: Training for physical tasks in virtual environments: Tai Chi. In: IEEE Virtual Reality, p. 87. IEEE Computer Society (2003)
  6. Ho, E., Chan, J., Komura, T., Leung, H.: Interactive partner control in close interactions for realtime applications. ACM Trans. Multimed. Comput. Commun. Appl. 9(3), 21 (2013)
  7. Hallam, J., Keen, E., Lee, C., et al.: Ballet hero: building a garment for memetic embodiment in dance learning. In: Proceedings of the 2014 ACM International Symposium on Wearable Computers: Adjunct Program, pp. 49–54. ACM (2014)
  8. Hachimura, K., Kato, H., Tamura, H.: A prototype dance training support system with motion capture and mixed reality technologies. In: IEEE International Workshop on Robot and Human Interactive Communication, pp. 217–222. IEEE (2004)
  9. Han, P.H., Chen, K.W., Hsieh, C.H., et al.: AR-Arm: augmented visualization for guiding arm movement in the first-person perspective. In: Augmented Human International Conference, p. 31. ACM (2016)
  10. Han, P.H., Chen, Y.S., Zhong, Y., et al.: My Tai-Chi coaches: an augmented-learning tool for practicing Tai-Chi Chuan. In: Augmented Human International Conference, p. 25. ACM (2017)
  11. Kitsikidis, A., Dimitropoulos, K., Douka, S., et al.: Dance analysis using multiple kinect sensors. In: International Conference on Computer Vision Theory and Applications, pp. 789–795 (2014)
  12. Kyan, M., Sun, G., Li, H., et al.: An approach to ballet dance training through MS kinect and visualization in a CAVE virtual reality environment. ACM Trans. Intell. Syst. Technol. 6(2), 291–300 (2015)
  13. Uejou, M., Huang, H.-H., Lee, J.-H., Kawagoe, K.: Toward a conversational virtual instructor of ballroom dance. In: Intelligent Virtual Agents, pp. 477–478. Springer, Berlin (2011)
  14. Marquardt, Z., Beira, J., Em, N., Paiva, I., Kox, S.: Super Mirror: a kinect interface for ballet dancers. In: Proceedings of CHI 2012 Extended Abstracts, pp. 1619–1624. ACM, New York (2012)
  15. Niwayama, T., Nakamura, A., Tabata, S., et al.: Mobile robot system for easy dance training. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, pp. 2223–2228. IEEE (2004)
  16. Sra, M., Schmandt, C.: MetaSpace: full-body tracking for immersive multiperson virtual reality. In: Adjunct Proceedings of the ACM Symposium on User Interface Software & Technology, pp. 47–48. ACM (2015)
  17. Semwal, S.K., Hightower, R., Stansfield, S.: Mapping algorithms for real-time control of an avatar using eight sensors. Presence: Teleoper. Virtual Environ. 7(1), 1–21 (1998)
  18. Trajkova, M., Cafaro, F.: E-ballet: designing for remote ballet learning. In: ACM International Joint Conference, pp. 213–216 (2016)
  19. Yan, S., et al.: OutsideMe: augmenting dancer’s external self-image by using a mixed reality system. In: Proceedings of CHI 2015 Extended Abstracts, pp. 965–970. ACM (2015)
  20. Yang, U., Kim, G.J.: Implementation and evaluation of “just follow me”: an immersive, VR-based, motion-training system. Presence: Teleoper. Virtual Environ. 11(3), 304–323 (2002)
  21. Myroniv, B., Wu, C.-W., Ren, Y., Christian, A.B., Bajo, E., Tseng, Y.-C.: Analyzing user emotions via physiology signals. Data Sci. Pattern Recogn. 1(2), 11–25 (2017)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Yufeng Wu (1)
  • Gangyi Ding (1) (corresponding author)
  • Hongsong Li (1)
  • Tong Xue (1)
  • Di Jiao (1)
  • Tianyu Huang (1)
  • Longfei Zhang (1)
  • Fuquan Zhang (1, 2)
  • Lin Xu (3)
  1. Digital Performance and Simulation Key Laboratory, School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China
  2. Fujian Provincial Key Laboratory of Information Processing and Intelligent Control, Minjiang University, Fuzhou, China
  3. Key Laboratory of Nondestructive Testing, Fuqing Branch of Fujian Normal University, Fuzhou, People’s Republic of China
