Visualizing Emotion in Musical Performance Using a Virtual Character

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 3638)

Abstract

We describe an immersive music visualization application which enables interaction between a live musician and a responsive virtual character. The character reacts to live performance in such a way that it appears to be experiencing an emotional response to the music it ‘hears.’ We modify an existing tonal music encoding strategy in order to define how the character perceives and organizes musical information. We reference existing research correlating musical structures and composers’ emotional intention in order to simulate cognitive processes capable of inferring emotional meaning from music. The ANIMUS framework is used to define a synthetic character who visualizes its perception and cognition of musical input by exhibiting responsive behaviour expressed through animation.
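The abstract describes a three-layer pipeline: a perception layer that encodes incoming musical events, a cognition layer that infers emotional meaning from them (drawing on research correlating musical structures with emotional intent, such as Cooke's), and an expression layer that renders the resulting state as animated behaviour. As a rough illustration of that loop, here is a minimal sketch; every name, mapping, and numeric value below is a hypothetical stand-in, not the authors' ANIMUS implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch of the perceive/cognize/express loop described in
# the abstract. All names and mappings are illustrative assumptions.

# Toy Cooke-style interval-to-affect table (semitones -> valence).
INTERVAL_VALENCE = {
    3: -0.6,   # minor third: commonly read as sorrowful
    4: +0.6,   # major third: commonly read as joyful
    7: +0.2,   # perfect fifth: stable, mildly positive
    6: -0.8,   # tritone: tense
}

@dataclass
class EmotionalState:
    valence: float = 0.0  # negative = sad/tense, positive = joyful

    def update(self, interval_semitones: int, decay: float = 0.8) -> None:
        # Cognition layer: blend the affect inferred from the newest
        # interval into the character's current emotional state.
        target = INTERVAL_VALENCE.get(interval_semitones % 12, 0.0)
        self.valence = decay * self.valence + (1 - decay) * target

def expression(state: EmotionalState) -> str:
    # Expression layer: pick an animation behaviour from the state.
    if state.valence > 0.3:
        return "dance"
    if state.valence < -0.3:
        return "cower"
    return "idle"

# Perception layer stand-in: a stream of melodic intervals (semitones)
# as a live pitch tracker might report them.
state = EmotionalState()
for interval in [4, 4, 7, 6, 6, 6, 6]:
    state.update(interval)
print(expression(state))  # prints "cower"
```

The exponential blend is one plausible way to make the character's response lag the music, so a single dissonant interval nudges rather than flips its mood; the actual system's perception and cognition layers are, of course, far richer than this table lookup.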



Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Taylor, R., Boulanger, P., Torres, D. (2005). Visualizing Emotion in Musical Performance Using a Virtual Character. In: Butz, A., Fisher, B., Krüger, A., Olivier, P. (eds) Smart Graphics. SG 2005. Lecture Notes in Computer Science, vol 3638. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11536482_2


  • Print ISBN: 978-3-540-28179-5

  • Online ISBN: 978-3-540-31905-4
