Lyricon (Lyrics + Earcons) Improves Identification of Auditory Cues

  • Yuanjing Sun
  • Myounghoon Jeon
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9187)

Abstract

Auditory researchers have developed various non-speech cues for designing auditory user interfaces. A preliminary study of “lyricons” (lyrics + earcons [1]) offered a novel approach to devising auditory cues for electronic products by combining two concurrent layers: musical speech and earcons (short musical motives). An experiment on sound-function meaning mapping was conducted comparing earcons and lyricons. It demonstrated that lyricons enhanced the relevance between sound and meaning significantly more than earcons did. Further analyses of error types and the confusion matrix showed that lyricons yielded a higher identification rate and a shorter mapping time than earcons. Factors affecting auditory cue identification and directions for applying lyricons are discussed.
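The identification-rate and confusion-matrix analysis mentioned above can be made concrete with a minimal sketch. The function names and counts below are invented purely for illustration and are not the paper's data: rows are the intended meanings of the cues, columns are the meanings participants chose, and a cue's identification rate is its diagonal cell divided by its row total.

    import numpy as np

    # Hypothetical sound-function confusion matrix (illustrative only).
    # Rows: intended function of the auditory cue; columns: function
    # chosen by participants. Diagonal cells are correct identifications.
    functions = ["power on", "power off", "pause", "resume"]
    confusion = np.array([
        [18, 1, 0, 1],
        [2, 16, 1, 1],
        [0, 2, 15, 3],
        [1, 1, 2, 16],
    ])

    # Identification rate per intended function: correct / row total.
    rates = confusion.diagonal() / confusion.sum(axis=1)
    for name, rate in zip(functions, rates):
        print(f"{name:>9}: {rate:.0%}")

    # Overall identification rate across all trials.
    print(f"  overall: {confusion.trace() / confusion.sum():.0%}")

Off-diagonal cells locate specific sound-meaning confusions (e.g., a "pause" cue heard as "resume"), which is what an error-type analysis of this kind would examine.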

Keywords

Auditory display · Auditory icons · Auditory user interface · Cognitive mapping · Earcons · Lyricons · Sonification

References

  1. Blattner, M.M., Sumikawa, D.A., Greenberg, R.M.: Earcons and icons: their structure and common design principles. Hum. Comput. Interact. 4, 11–44 (1989)
  2. Gaver, W.W.: Auditory icons: using sound in computer interfaces. Hum. Comput. Interact. 2, 167–177 (1986)
  3. Walker, B.N., Lindsay, J., Nance, A., Nakano, Y., Palladino, D.K., Dingler, T., Jeon, M.: Spearcons (speech-based earcons) improve navigation performance in advanced auditory menus. Hum. Factors 55(1), 157–182 (2012)
  4. Jeon, M., Walker, B.N.: Spindex (speech index) improves auditory menu acceptance and navigation performance. ACM Trans. Accessible Comput. 3(3), 10:1–10:26 (2011)
  5. Stevens, C., Brennan, D., Parker, S.: Simultaneous manipulation of parameters of auditory icons to convey direction, size, and distance: effects on recognition and interpretation. In: Proceedings of the International Conference on Auditory Display (ICAD2004), Sydney, Australia (2004)
  6. Brewster, S.A.: Using nonspeech sounds to provide navigation cues. ACM Trans. Comput. Hum. Interact. (TOCHI) 5, 224–259 (1998)
  7. Baldwin, C.L.: Auditory Cognition and Human Performance: Research and Applications. CRC Press, Boca Raton (2012)
  8. Jeon, M.: Lyricons (lyrics + earcons): Designing a new auditory cue combining speech and sounds. In: Stephanidis, C. (ed.) HCII 2013, Part I. CCIS, vol. 373, pp. 342–346. Springer, Heidelberg (2013)
  9. Jeon, M., Lee, J.-H.: The ecological AUI (auditory user interface) design and evaluation of user acceptance for various tasks on smartphones. In: Kurosu, M. (ed.) HCII/HCI 2013, Part IV. LNCS, vol. 8007, pp. 49–58. Springer, Heidelberg (2013)
  10. Ballas, J.A.: Common factors in the identification of an assortment of brief everyday sounds. J. Exp. Psychol. Hum. Percept. Perform. 19(2), 250–267 (1993)
  11. Deutsch, D., Henthorn, T., Lapidis, R.: Illusory transformation from speech to song. J. Acoust. Soc. Am. 129(4), 2245–2252 (2011)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Mind Music Machine Lab, Cognitive & Learning Sciences, Michigan Technological University, Houghton, USA
  2. Mind Music Machine Lab, Computer Science, Michigan Technological University, Houghton, USA