Computational Visual Media, Volume 4, Issue 3, pp 197–208

Dance to the beat: Synchronizing motion to audio

  • Rachele Bellini
  • Yanir Kleiman
  • Daniel Cohen-Or
Open Access
Research Article

Abstract

In this paper we introduce a video post-processing method that enhances the rhythm of a dancing performance, in the sense that the dancing movements become better synchronized with the beat of the music. The dancing performance observed in a video is analyzed and segmented into motion intervals delimited by motion beats. We present an image-space method to extract the motion beats of a video by detecting frames at which there is a significant change in direction or the motion stops. The motion beats are then synchronized with the music beats such that as many beats as possible are matched while introducing as little time-warping distortion as possible to the video. We show two applications of this cross-media synchronization: one in which a given dance performance is enhanced to be better synchronized with its original music, and one in which a given dance video is automatically adapted to be synchronized with different music.
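As a rough illustration of the motion-beat extraction idea described above, the sketch below (Python with NumPy) marks frames where the overall motion momentarily stops as candidate motion beats. The frame-difference motion measure, the local-minimum test, and all function names are our own simplifications for illustration, not the paper's actual image-space method:

```python
import numpy as np

def motion_magnitude(frames):
    """Per-frame motion strength from consecutive frame differences.

    frames: array of shape (T, H, W), grayscale intensities.
    Returns an array of length T-1, where entry t measures how much
    the image changed between frame t and frame t+1.
    """
    diffs = np.abs(np.diff(frames.astype(np.float64), axis=0))
    return diffs.mean(axis=(1, 2))

def detect_motion_beats(magnitude, threshold=None):
    """Flag frames where motion momentarily stops.

    A frame is a candidate motion beat if its motion magnitude is a
    local minimum of the motion signal and falls below a threshold
    (here, simply the mean magnitude, an arbitrary choice).
    """
    if threshold is None:
        threshold = magnitude.mean()
    beats = []
    for t in range(1, len(magnitude) - 1):
        if (magnitude[t] <= magnitude[t - 1]
                and magnitude[t] < magnitude[t + 1]
                and magnitude[t] < threshold):
            beats.append(t)
    return beats
```

A real system would use a more robust motion estimate (e.g., optical flow) and would also detect sharp direction changes, not only stops; this sketch only conveys the overall shape of the first stage, before the detected motion beats are matched to music beats by time warping.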

Keywords

video processing; synchronization; motion segmentation; video analysis

Supplementary material

41095_2018_115_MOESM1_ESM.mp4 (23.1 mb)
Dance to the Beat: Enhancing Dancing Performance in Video


Copyright information

© The Author(s) 2018

Authors and Affiliations

  • Rachele Bellini¹
  • Yanir Kleiman¹
  • Daniel Cohen-Or¹

  1. Tel Aviv University, Tel Aviv, Israel
