
MuSA.RT

  • Elaine Chew
Chapter
Part of the International Series in Operations Research & Management Science book series (ISOR, volume 204)

Abstract

The purpose of this chapter is to describe MuSA.RT Opus 2, an interactive system for tonal visualization of music at multiple scales, and to present examples of the types of musical features and attributes that can be abstracted and visualized by the system. MuSA.RT aims to create an environment in which musical performances are mapped in real time to a concrete, visual metaphor for tonal space, wherein we can see the establishment and evolution of the tonal context. In this environment, expert musicians can see the tonal structures of what they play, initiated listeners can visually follow the structures that they hear, and novices can learn to hear the structures that they see. MuSA.RT is both an interactive art installation that converts musical performances into mathematically elegant graphics and a scientific tool for visualizing the inner workings of tonal induction and tracking algorithms. In this chapter we describe the mapping strategies for transforming a MIDI stream into tonal structures in 3D space and our solution for overcoming the challenge of real-time concurrent processing of data streams; we also give examples and present case studies of visual mappings of music by Pachelbel, Bach, and Barber. The reader can download the latest version of the MuSA.RT software, MuSA_RT, from the Mac App Store (https://itunes.apple.com/ca/app/musa-rt/id506866959?mt=12, accessed 30 August 2013), follow the examples posted at https://musa-rt.blogspot.com, or try out new examples of their own.
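
As a rough illustration of the mapping strategy summarized above, the sketch below (Python, for concreteness) places pitch classes on a helix indexed along the line of fifths and summarizes the sounding notes as a duration-weighted centre of effect, which is then matched against candidate major-triad representations. The helix radius, rise, and chord weights used here are illustrative placeholders rather than the calibrated Spiral Array parameters, and the pitch-spelling step that maps MIDI note numbers onto the line of fifths is assumed away.

# Minimal sketch of a Spiral-Array-style pitch-class helix and a
# duration-weighted "centre of effect" (CE). Radius, rise, and the
# chord weights are illustrative placeholders, not calibrated values.
import math

R = 1.0                     # helix radius (placeholder)
H = math.sqrt(2.0 / 15.0)   # rise per step along the line of fifths (placeholder)

def pitch_position(k):
    """Position of the pitch class k steps along the line of fifths (C = 0)."""
    return (R * math.sin(k * math.pi / 2.0),
            R * math.cos(k * math.pi / 2.0),
            k * H)

def centre_of_effect(notes):
    """Duration-weighted average of pitch positions.

    `notes` is a list of (k, duration) pairs; in practice, MIDI note numbers
    would first need to be spelled onto the line of fifths (omitted here).
    """
    total = sum(d for _, d in notes)
    x = sum(d * pitch_position(k)[0] for k, d in notes) / total
    y = sum(d * pitch_position(k)[1] for k, d in notes) / total
    z = sum(d * pitch_position(k)[2] for k, d in notes) / total
    return (x, y, z)

def nearest_major_triad(ce):
    """Return the root (on the line of fifths) of the major-triad
    representation closest to the CE, using toy weights."""
    w = (0.6, 0.3, 0.1)  # root, fifth, third weights (placeholders)
    best_k, best_d = None, float("inf")
    for k in range(-6, 7):  # a window of candidate roots
        root = pitch_position(k)
        fifth = pitch_position(k + 1)
        third = pitch_position(k + 4)
        triad = tuple(w[0]*a + w[1]*b + w[2]*c
                      for a, b, c in zip(root, fifth, third))
        d = math.dist(ce, triad)
        if d < best_d:
            best_k, best_d = k, d
    return best_k

# Example: a C major triad held for equal durations (C = 0, G = 1, E = 4 on
# the line of fifths) selects root k = 0, i.e. C.
ce = centre_of_effect([(0, 1.0), (1, 1.0), (4, 1.0)])
print(nearest_major_triad(ce))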

Keywords

Camera view · Tonal pattern · Tonal structure · Pitch class · Close triad

Notes

Acknowledgments

We thank Carol Krumhansl for her comments and suggestions that have helped improve the clarity of this chapter, and Craig Sapp for encoding the MIDI examples.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Centre for Digital Music, Queen Mary University of London, London, UK
