
Augmenting Bioacoustic Cognition with Tangible User Interfaces

  • Isak Herman
  • Leonardo Impett
  • Patrick K. A. Wollner
  • Alan F. Blackwell
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9183)

Abstract

Using a novel visualization and control interface – the Mephistophone – we explore the development of a user interface for acoustic visualization and analysis of bird calls. Our intention is to utilize embodied computation as an aid to acoustic cognition. The Mephistophone demonstrates ‘mixed initiative’ design, where humans and systems collaborate toward creative and purposeful goals. The interaction modes of our prototype allow the dextral manipulation of abstract acoustic structure. Combining information visualization, timbre-space exploration, collaborative filtering, feature learning, and human inference tasks, we examine the haptic and visual affordances of a 2.5D tangible user interface (TUI). We explore novel representations in the audio representation space and how a transition from spectral to timbral visualization can enhance user cognition.
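To make the spectral-to-timbral transition concrete: a spectrogram shows energy per frequency bin over time, while a timbral descriptor such as the spectral centroid summarizes each frame's "brightness" as a single value. The following is a minimal sketch, not taken from the paper, of how both representations might be computed for a synthetic upward-sweeping call; the function names and the chirp signal are illustrative assumptions.

```python
import numpy as np

def stft_magnitude(signal, frame_len=256, hop=128):
    """Short-time Fourier transform magnitude: the basis of a spectrogram."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # One spectrum per frame; shape: (n_frames, frame_len // 2 + 1)
    return np.abs(np.fft.rfft(frames, axis=1))

def spectral_centroid(magnitudes, sample_rate, frame_len=256):
    """One simple timbral descriptor: the spectral 'centre of mass' per frame."""
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sample_rate)
    power = magnitudes ** 2
    return (power * freqs).sum(axis=1) / np.maximum(power.sum(axis=1), 1e-12)

# Synthetic 'bird call': a half-second tone sweeping from 2 kHz up to 4 kHz.
sr = 16000
t = np.linspace(0, 0.5, sr // 2, endpoint=False)
chirp = np.sin(2 * np.pi * (2000 * t + 2000 * t ** 2))

spec = stft_magnitude(chirp)          # spectral view: 2-D time-frequency image
centroids = spectral_centroid(spec, sr)  # timbral view: 1-D brightness contour
```

Where the spectrogram requires the viewer to scan a 2-D image, the centroid contour collapses each frame to one number, which is the kind of dimensionality reduction a timbral visualization can exploit.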

Keywords

Tangible user interfaces · Embodied interaction · Bioacoustics · Information visualization · Collaborative filtering


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Isak Herman¹ (email author)
  • Leonardo Impett²
  • Patrick K. A. Wollner²
  • Alan F. Blackwell¹

  1. Computer Laboratory, University of Cambridge, Cambridge, UK
  2. Department of Engineering, University of Cambridge, Cambridge, UK
