Comparing the Timing of Movement Events for Air-Drumming Gestures

Luke Dahl
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9617)

Abstract

New air-instruments allow us to control sound by moving our bodies in space without manipulating a physical object. However, when we want to trigger a discrete sound at a precise time, for example by making a drumming gesture, the timing feels wrong. This work aims to understand which aspects of a performer's movement correspond to their subjective sense of when the sound should occur. A study of air-drumming gestures was conducted, and the timing of eight movement events, based on movements of the hand and of the wrist and elbow joints, is examined. In general, it is found that movement events based on peaks in acceleration are preferable, because they occur earlier and exhibit less timing noise than events based on changes of direction.
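
To make the comparison concrete, the sketch below (a minimal, hypothetical illustration in Python with NumPy, not the study's actual analysis pipeline) locates the two kinds of movement event in a one-dimensional position trace: an acceleration-based event at the peak of acceleration magnitude, and a direction-change event at the downward-to-upward zero crossing of velocity. The function name, synthetic trajectory, and sampling rate are assumptions made for illustration.

    import numpy as np

    def movement_events(pos, fs):
        """Locate two candidate 'hit' events in a 1-D position trace.

        pos -- vertical hand position samples for one drumming stroke
        fs  -- sampling rate in Hz
        Returns (t_accel_peak, t_direction_change) in seconds.
        Illustrative assumptions only, not the paper's definitions.
        """
        vel = np.gradient(pos) * fs   # finite-difference velocity
        acc = np.gradient(vel) * fs   # finite-difference acceleration

        # Event A: peak in acceleration magnitude (the sharp
        # deceleration near the bottom of a downward stroke).
        i_acc = int(np.argmax(np.abs(acc)))

        # Event B: change of direction, i.e. the first
        # downward-to-upward zero crossing of velocity.
        crossings = np.where((vel[:-1] < 0) & (vel[1:] >= 0))[0]
        i_dir = int(crossings[0]) if crossings.size else i_acc

        return i_acc / fs, i_dir / fs

    # Toy single stroke: the hand moves down, stops, and rebounds.
    fs = 200.0
    t = np.arange(0.0, 0.5, 1.0 / fs)
    pos = -np.sin(2 * np.pi * 2 * t) * np.exp(-3 * t)
    t_acc, t_dir = movement_events(pos, fs)
    print(f"acceleration-peak event at {t_acc * 1000:.0f} ms, "
          f"direction-change event at {t_dir * 1000:.0f} ms")

On real motion-capture or accelerometer data, both event times would be compared against each performer's intended hit time; the paper's finding is that acceleration-based events tend to precede direction-change events and to vary less from stroke to stroke.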

Keywords

Gesture · Musical gesture · Air-instruments · Air-drumming · New interfaces for musical expression

Acknowledgments

This research was performed as part of my PhD thesis at CCRMA, Stanford University.

Copyright information

© Springer International Publishing Switzerland 2016

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 Generic License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Department of Music, University of Virginia, Charlottesville, USA
