2004: The Electronic Sitar Controller

A NIME Reader

Part of the book series: Current Research in Systematic Musicology (CRSM, volume 3)

Abstract

This paper describes the design of an Electronic Sitar controller, a digitally modified version of Saraswati’s (the Hindu Goddess of Music) 19-stringed, pumpkin-shelled, traditional North Indian instrument. The ESitar uses sensor technology to extract gestural information from a performer, deducing musical information such as pitch, pluck timing, thumb pressure, and 3 axes of head tilt to trigger real-time sounds and graphics. It allows for a variety of traditional sitar techniques as well as new performance methods. Graphical feedback allows for artistic display and pedagogical feedback. The ESitar uses a programmable Atmel microprocessor which outputs control messages via a standard MIDI jack.
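
The signal path described in the abstract (analog sensors sampled by an Atmel microcontroller, which emits MIDI control messages through a standard MIDI jack) can be illustrated with a minimal firmware sketch. What follows is a hedged reconstruction, not the ESitar’s actual firmware: it assumes an ATmega-class AVR, and the ADC channel, controller number, and MIDI channel are illustrative placeholders.

    /* Minimal sketch of the sensor-to-MIDI loop the abstract describes:
       read a thumb-pressure FSR on an ADC pin and emit a MIDI control
       change over the UART. Register names assume an ATmega-class AVR;
       the CC number and channel are illustrative, not the ESitar's. */
    #define F_CPU 16000000UL              /* assumed clock; match the board */
    #include <avr/io.h>
    #include <util/delay.h>

    #define MIDI_BAUD 31250UL             /* serial rate fixed by the MIDI spec */
    #define CC_THUMB  1                   /* illustrative controller number */

    static void uart_init(void) {
        UBRR0 = F_CPU / 16 / MIDI_BAUD - 1;   /* = 31 at 16 MHz */
        UCSR0B = (1 << TXEN0);                /* transmit only */
    }

    static void uart_send(uint8_t b) {
        while (!(UCSR0A & (1 << UDRE0)))
            ;                                 /* wait for the data register */
        UDR0 = b;
    }

    static uint8_t read_fsr(void) {
        ADMUX = (1 << REFS0);                 /* AVcc reference, channel 0 */
        ADCSRA |= (1 << ADSC);                /* start one conversion */
        while (ADCSRA & (1 << ADSC))
            ;                                 /* wait for it to finish */
        return (uint8_t)(ADC >> 3);           /* 10-bit reading -> 7-bit MIDI */
    }

    int main(void) {
        uart_init();
        ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);  /* ADC on, clk/64 */
        uint8_t last = 0xFF;
        for (;;) {
            uint8_t v = read_fsr();
            if (v != last) {                  /* send only on change */
                uart_send(0xB0);              /* control change, channel 1 */
                uart_send(CC_THUMB);
                uart_send(v);
                last = v;
            }
            _delay_ms(5);                     /* ~200 Hz polling */
        }
    }

Only the 31250 baud rate and the 7-bit controller-value range are fixed by MIDI; everything else here is a design choice.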

References

  • Bagchee, S. (1998). Understanding raga music. Mumbai, India: Ceshwar Business Publications Inc.

  • Benning, M., Kapur, A., Till, B., & Tzanetakis, G. (2007). Multimodal sensor analysis on sitar performance: Where is the beat? In Proceedings of the IEEE International Workshop on Multimedia Signal Processing. Crete, Greece.

  • Dixon, S. (2000). A lightweight multi-agent musical beat tracking system. In Pacific Rim International Conference on Artificial Intelligence (pp. 778–788).

  • Eigenfeldt, A., & Kapur, A. (2008). Multi-agent multimodal performance analysis. In Proceedings of the International Computer Music Conference. Belfast, U.K.

  • Fiebrink, R., Trueman, D., & Cook, P. R. (2009). A meta-instrument for interactive, on-the-fly machine learning. In Proceedings of the International Conference on New Interfaces for Musical Expression.

  • Hunt, A., Wanderley, M. M., & Paradis, M. (2002). The importance of parameter mapping in electronic instrument design. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 149–154). Dublin, Ireland.

  • Kapur, A., Darling, M., Diakopoulos, D., Murphy, J., Hochenbaum, J., Vallis, O., et al. (2011). The machine orchestra: An ensemble of human laptop performers and robotic musical instruments. Computer Music Journal, 35(4).

  • Kapur, A., Percival, G., Lagrange, M., & Tzanetakis, G. (2007). Pedagogical transcription for multimodal sitar performance. In Proceedings of the International Conference on Music Information Retrieval. Vienna, Austria.

  • Kapur, A., Essl, G., Davidson, P., & Cook, P. R. (2003). The electronic tabla controller. Journal of New Music Research, 32(4), 351–360.

  • Knapp, R., Jaimovich, J., & Coghlan, N. (2009). Measurement of motion and emotion during musical performance. In Proceedings of the IEEE International Conference on Affective Computing and Intelligent Interaction (Vol. 1, pp. 735–739).

  • Laubier, S. (1998). The meta-instrument. Computer Music Journal, 22(1), 25–29.

  • Menon, R. R. (1974). Discovering Indian music. Mumbai, India: Somaiya Publications Pvt. Ltd.

  • Merrill, D. (2003). Head-tracking for gestural and continuous control of parameterized audio effects. In Proceedings of the International Conference on New Interfaces for Musical Expression. Montreal, Canada.

  • Puckette, M. (1996). Pure data: Another integrated computer music environment. In Proceedings of the Second Intercollege Computer Music Concerts (pp. 37–41). Tachikawa, Japan.

  • Sharma, S. (1997). Comparative study of evolution of music in India & the West. New Delhi, India: Pratibha Prakashan.

  • Snyder, J. (2010). Exploration of an adaptable just intonation system. Ph.D. thesis, Columbia University.

  • Vir, R. A. (1998). Learn to play on sitar. New Delhi, India: Punjab Publications.

  • Waisvisz, M. (1985). The Hands, a set of remote MIDI-controllers. In Proceedings of the International Computer Music Conference (pp. 313–318). San Francisco, CA: International Computer Music Association.

  • Wilson, S., Gurevich, M., Verplank, B., & Stang, P. (2003). Microcontrollers in music HCI instruction: Reflections on our switch to the Atmel AVR platform. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 24–29). Montreal, Canada.

  • Zölzer, U. (2002). DAFX: Digital audio effects. England: John Wiley & Sons Ltd.

Acknowledgements

We would like to thank Bill Verplank, Michael Gurevich, Scott Wilson, and Max Mathews for their workshop at CCRMA on controllers and the Atmel microprocessor. We would also like to thank Asha Kapur, and Dolly and Surinder Vohra, for bringing a sitar from India with which to build the ESitar. We would also like to thank Ustad Siraj Khan of the Mewati gharana for his training in the traditional classical theory and technique of sitar performance. Other thanks to Andrew Schloss, Peter Driessen, George Tzanetakis, Tae Hong Park, Dan Trueman, Curtis Bahn, Pavan Vohra, Sona Vohra, Meera Kapur, and Arun Kapur for their support and inspiration.

Corresponding author

Correspondence to Ajay Kapur.

Appendices

Author Commentary: Extended Performance, Modern Pedagogy and Symbiotic Mechatronic Control on the Electronic Sitar

Ajay Kapur

The ESitar project started as a simple idea: to bring all the wonderful advancements of Computer Music technology that were being applied to Western music to non-Western instruments, namely Indian Classical musical instruments. The idea of the “HyperInstrument” had already emerged for the Piano, Cello, Violin, Guitar, and even Saxophone, and so much rich data was being collected for augmented and extended performance techniques. Taking these ideas to Raga and Tala with the series of KarmetiK instruments started an entire lifetime of research and artistic pursuit. Three lessons emerged from building the ESitar, which I now carry into creating any new musical instrument:

(1) Practicing your Interface: It has now been over 10 years since I built the first edition of the ESitar. One major success of this invention is that I always have one of my ESitars sitting open in my studio, ready to play. This is the key to actually practicing the instrument and inventing new ways of performing with it on stage. I no longer play sitar; I play the ESitar. I cannot imagine getting on stage with a sitar without sensors on it! I would have no idea what to do! I can’t imagine not being able to bend my instrument forward to trigger a sustained reverb, not being able to play a synthesizer with just my frets, or not being able to wah-wah the audio from my live sitar with the thumb pressure of my right hand! I would be naked! There have been four iterations of the ESitar’s design, each learning from the previous version, and each pushed by new compositional motivations. This is how you build an instrument that stays with you for life. You practice it.

(2) Mining for Meaning: During my Ph.D., I quickly realized that because I had an instrument with sensors, I could solve research problems being asked in the Music Information Retrieval field with much more accuracy, and in real time, using my multimodal sensor solution. This allowed me to turn my ESitar into a system that knew what tempo I was playing (Benning et al. 2007), what pitch I was playing (Kapur et al. 2007), what section of a composition I was in (Eigenfeldt and Kapur 2008), and what emotional state I was in, all in real time. This data is essential for the preservation techniques our team was building for Indian Classical music. Most music masters are preserved only in audio recordings. But this is not enough! On what fret did they play that note? What posture were they sitting in? At what angle was the instrument held? All this information is essential to preservation and to building future pedagogical systems. This has spawned an entire field of research, with my Ph.D. students studying multimodal techniques for musical data analysis and pedagogy. There is so much more you can do with a NIME than just perform on stage.
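
As a rough illustration of why direct sensor access makes such analyses tractable: plucks appear as clean peaks in the thumb-pressure signal, so a first-pass tempo estimate needs little more than threshold crossings and inter-onset intervals. The sketch below is a deliberately naive version of that idea, with an invented threshold and a one-pluck-per-beat assumption; the multimodal beat tracking cited above (Benning et al. 2007) is far more robust.

    /* Naive tempo estimate from a thumb-pressure buffer: detect plucks as
       upward threshold crossings, then convert the mean inter-onset
       interval to beats per minute. Threshold, buffer length, and the
       one-pluck-per-beat assumption are all illustrative. */
    #include <stdio.h>

    #define MAX_ONSETS 64

    double tempo_from_pressure(const double *pressure, int n,
                               double sample_rate, double threshold) {
        double onsets[MAX_ONSETS];
        int count = 0;
        for (int i = 1; i < n && count < MAX_ONSETS; i++) {
            if (pressure[i - 1] < threshold && pressure[i] >= threshold)
                onsets[count++] = (double)i / sample_rate;  /* onset time, s */
        }
        if (count < 2)
            return 0.0;                  /* not enough plucks to estimate */
        double mean_ioi = (onsets[count - 1] - onsets[0]) / (count - 1);
        return 60.0 / mean_ioi;          /* beats per minute */
    }

    int main(void) {
        /* Synthetic trace: a pluck every 0.5 s at 100 Hz, i.e. 120 BPM. */
        double p[400] = {0};
        for (int i = 50; i < 400; i += 50)
            p[i] = 1.0;
        printf("estimated tempo: %.1f BPM\n",
               tempo_from_pressure(p, 400, 100.0, 0.5));
        return 0;
    }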

(3) Human to Mechatronic Performance: One of the most important features of building the ESitar was the ability to communicate with my first robot, the MahaDeviBot. I could never have accomplished the deep interaction techniques with the mechatronic instrument by using audio analysis alone. In a way, though we have experimented with many machine learning techniques in the lab and on stage, I feel that through my ESitar I am able to extend my control to an entire array of mechatronic musical actuators placed all around a concert hall. After exploring this as a solo musician for years, I accomplished my true aesthetic goals when creating the KarmetiK Machine Orchestra (Kapur et al. 2011) and building a framework to teach other artists and composer/performers to collaborate in this space together. This symbiotic relationship between humans with NIMEs and an orchestra of 20 different musical mechatronic instruments with over 300 moving actuators was the beginning of a movement that will take our team decades to continue to explore.

In summary, NIME has evolved from just building instruments to using new instruments to gather data for advanced music research, pushing pedagogical tools into the future as well as creating new expressive music, with the maturity of these instruments now lasting years and even decades as the field has evolved.

Expert Commentary: Old Lessons for New “Instruments”

Dan Trueman

As we know, the designer of digital musical instruments is invited to completely rethink the relationship between body and sound. No longer constrained by the physics of vibrating strings and resonators, the designer can freely imagine how the body might initiate and modulate sound; the prospect of “any sound you can imagine” always lies just around the proverbial corner, and we can’t wait to get our hands on it.

And therein lies the problem: how exactly should we “get our hands on it?” And, really, how do we “imagine” sound, apart from what we already know in the world, and apart from our bodies? These are tough questions that don’t yield easily, and they require inquiry from as many vantage points as possible, whether that be building entirely new instruments “from scratch” (like Serge de Laubier’s “meta-instrument” (Laubier 1998)), accessing the body or nervous system directly (the Hands (Waisvisz 1985), or the Biomuse system (Knapp et al. 2009)), or reinventing very old instruments (Jeff Snyder’s virtually just-intoned Contravielle and Birl, for instance (Snyder 2010)).

Kapur’s electronic sitar represents a particularly strong case of yet another widely used approach, the so-called “hybrid” digital instrument. Sometimes this manifests itself in fairly crude, if effective, ways: put a couple of sensors on just about any existing instrument, create a basic mapping from sensor to sound or signal processing, and voilà, we have a new hybrid instrument that might just be well worth some time and effort. Pre-existing instruments that have drawn the sustained attention of musicians over decades and centuries (or millennia, in the case of the sitar!) have so much to offer that they are natural starting points for exploring digital instrument design, and they provide clues as to what sorts of connections between body and sound are particularly compelling; the design of the violin, for instance, is much more than an awkward compromise between human body and resonating body. Simply putting a couple of sensors on one of these instruments might get us pretty far, if only because these instruments are already so rich to begin with, but Kapur et al. went much further, carefully designing a multifaceted system that was informed by the rich history and performance practices of the sitar, while not remaining confined by these practices.

Consider, for instance, their approach to capturing what the sitarist’s hands are doing. For the left hand, an array of circuits identifies which fret is being played, while a contact microphone and pitch detection quantify the all-important pulling and detuning of the string; this combination captures much more than either element could on its own. And for the right hand, a pair of condenser microphones embedded in the bridge of the instrument combines with a force-sensing resistor (FSR) under the thumb to capture pluck time and direction; I am particularly intrigued by the careful placement of the FSR, which serves to identify pluck direction but would seem, since the thumb is a crucial fixed connection between body and instrument, to offer the potential to reveal much more. This is a rich and multifaceted sensor array, and it is not hard to imagine how it might be used with a contemporary machine-learning mapping system like Rebecca Fiebrink’s Wekinator (Fiebrink et al. 2009). At the time, Kapur mapped it to a variety of things, some immediately “instrumental” in character, others more “accompanimental” in nature (like drones and tabla riffs), and others yet further afield, like visuals.
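
To make concrete why this left-hand fusion captures more than either sensor alone, here is a small sketch of one plausible combination: the fret network supplies a quantized base pitch, and pitch tracking on the contact-mic signal measures how far a pull has raised the string above it. The one-semitone-per-fret spacing and the open-string tuning are illustrative assumptions (sitar frets are movable), not the ESitar’s actual calibration.

    /* Fuse the discrete fret reading with the frequency detected on the
       contact microphone: the fret gives the quantized base pitch, and
       anything above it is attributed to pulling the string. */
    #include <math.h>
    #include <stdio.h>

    #define OPEN_STRING_NOTE 62.0  /* hypothetical open-string pitch (MIDI D4) */

    double sounding_pitch(int fret, double detected_hz) {
        double base = OPEN_STRING_NOTE + fret;  /* assumes one semitone per fret */
        double detected = 69.0 + 12.0 * log2(detected_hz / 440.0);  /* Hz -> MIDI */
        double pull = detected - base;
        if (pull < 0.0)
            pull = 0.0;             /* pulling can only raise the pitch */
        return base + pull;         /* fractional MIDI note number */
    }

    int main(void) {
        /* Fret 5 (MIDI 67) pulled up about a whole step to 440 Hz (A4). */
        printf("sounding pitch: %.2f\n", sounding_pitch(5, 440.0));
        return 0;
    }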

I had the great privilege of performing alongside Kapur and his ESitar shortly after it was built. One of the compelling things about his hybrid instrument is that it is hybrid in multiple ways; yes, it is a hybrid digital and acoustic instrument, but it is also a hybrid instrument and system. Playing with him felt like playing with an instrumentalist, but also, say, an airline pilot; he was driving a system that had its own momentum, its own sources of energy, while also subtly manipulating an instrument that was immediately responsive to the nuanced expressive variations of his playing. In some ways, it is this latter hybridity that is so new and exciting about digital instruments; our “instruments” can take on lives of their own in ways that acoustic ones simply can’t, and these systems offer unprecedented compositional and design possibilities. Even this many years after the ESitar was built, it feels like we are at the beginning of something new and exciting, where the lessons from these beautiful old instruments continue to inform and inspire our construction of new instruments and systems.

Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Kapur, A., Lazier, A.J., Davidson, P., Wilson, R.S., Cook, P.R. (2017). 2004: The Electronic Sitar Controller. In: Jensenius, A., Lyons, M. (eds) A NIME Reader. Current Research in Systematic Musicology, vol 3. Springer, Cham. https://doi.org/10.1007/978-3-319-47214-0_10

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-47214-0_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47213-3

  • Online ISBN: 978-3-319-47214-0

  • eBook Packages: Engineering, Engineering (R0)
