Estimation of Guitar Fingering and Plucking Controls Based on Multimodal Analysis of Motion, Audio and Musical Score

  • Alfonso Perez-Carrillo
  • Josep-Lluis Arcos
  • Marcelo Wanderley
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9617)


This work presents a method for extracting instrumental controls during guitar performances. The method is based on the analysis of multimodal data combining motion capture, audio analysis and the musical score. High-speed cameras with marker identification track the positions of finger bones and articulations, and audio is recorded with a transducer that measures vibration on the guitar body. The extracted parameters are divided into left-hand controls, i.e. fingering (which string and fret is pressed by which left-hand finger), and right-hand controls, i.e. the plucked string, the plucking finger and the characteristics of the pluck (position, velocity and angles with respect to the string). Controls are estimated from probability functions of low-level features, namely the plucking instants (i.e. note onsets), the pitch, and the distances of the fingers of both hands to the strings and frets. Note onsets are detected via audio analysis, the pitch is extracted from the score, and distances are computed using 3D Euclidean geometry. Results show that, by combining multimodal information, it is possible to estimate this comprehensive set of control features, with especially high performance for fingering and plucked-string estimation. Accuracy is lower for the plucking finger and the pluck characteristics, but improvements are foreseen by incorporating a hand model and using high-speed cameras for calibration and evaluation.
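The finger-to-string distance features mentioned above reduce to a standard point-to-segment computation in 3D Euclidean geometry. The sketch below is illustrative, not the authors' implementation: it assumes motion capture yields 3D positions (here in metres) for a fingertip marker and for the two endpoints of a string (nut and bridge); all names and coordinates are hypothetical.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Shortest 3D distance from point p to the line segment from a to b."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    # Project p onto the infinite line through a and b, then clamp the
    # projection parameter to [0, 1] so the closest point stays on the segment.
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    return float(np.linalg.norm(p - closest))

# Hypothetical marker positions: a fingertip hovering near one string,
# with the string modelled as a segment from nut to bridge (x axis).
fingertip = (0.30, 0.02, 0.01)
nut, bridge = (0.0, 0.0, 0.0), (0.648, 0.0, 0.0)
print(point_to_segment_distance(fingertip, nut, bridge))  # ≈ 0.02236
```

Repeating this computation for every fingertip against every string (and analogously against fret positions) yields the distance features that the probability functions operate on.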


Keywords: Guitar · Instrumental control · Motion capture · Audio analysis



A. Perez-Carrillo was supported by a Beatriu de Pinos grant (2010 BP-A 00209) from the Catalan Research Agency (AGAUR), and J. Ll. Arcos was supported by the ICT-2011-8-318770 and 2009-SGR-1434 projects.



Copyright information

© Springer International Publishing Switzerland 2016

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License, which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  • Alfonso Perez-Carrillo (1, 3)
  • Josep-Lluis Arcos (2)
  • Marcelo Wanderley (3)
  1. IDMIL, McGill University, Montreal, Canada
  2. IIIA-CSIC, Barcelona, Spain
  3. Music Technology Group, Universitat Pompeu Fabra, Barcelona, Spain
