Auditory Representations of a Graphical User Interface for a Better Human-Computer Interaction

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 5954)

Abstract

As part of a project to improve human-computer interaction, primarily for blind users, a survey of 50 blind and 100 sighted users included a questionnaire about their habits in everyday personal computer use. Based on their answers, the most important functions and applications were selected, and the results of the two groups were compared. Special user habits and needs of blind users are described. The second part of the investigation comprised collecting auditory representations (auditory icons, spearcons, etc.), mapping them to visual information, and evaluating them with the target groups. Furthermore, a new design method and class of auditory events, called “auditory emoticons”, was introduced. These use non-verbal human voice samples to convey additional emotional content. Blind and sighted users evaluated the different auditory representations for the selected events, including spearcons in different languages. Auditory icons using familiar environmental sounds, as well as the emoticons, were received very well, whilst spearcons appear to be redundant except for menu navigation by blind users.
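For readers unfamiliar with the term: a spearcon is produced by time-compressing a spoken phrase (such as a menu label) until it is no longer perceived as ordinary speech, while its pitch is preserved. The following is a minimal sketch of that generation step, not the paper's own implementation; it assumes the Python libraries librosa and soundfile, a hypothetical pre-rendered text-to-speech clip menu_item.wav, and an illustrative 2.5x compression rate.

```python
# Minimal sketch of spearcon generation: time-compress a spoken label
# while preserving pitch, so it stays identifiable but faster than speech.
# Assumptions: librosa and soundfile are installed; "menu_item.wav" is a
# hypothetical pre-rendered TTS recording of a menu label.
import librosa
import soundfile as sf

def make_spearcon(tts_wav: str, out_wav: str, rate: float = 2.5) -> None:
    """Write a spearcon: `tts_wav` sped up by `rate` with pitch preserved."""
    y, sr = librosa.load(tts_wav, sr=None)               # keep native sample rate
    y_fast = librosa.effects.time_stretch(y, rate=rate)  # phase-vocoder speed-up
    sf.write(out_wav, y_fast, sr)

if __name__ == "__main__":
    make_spearcon("menu_item.wav", "menu_item_spearcon.wav")
```

The pitch-preserving stretch matters: simple resampling would also raise the pitch, turning the label into a chirp rather than compressed speech.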




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wersényi, G. (2010). Auditory Representations of a Graphical User Interface for a Better Human-Computer Interaction. In: Ystad, S., Aramaki, M., Kronland-Martinet, R., Jensen, K. (eds) Auditory Display. CMMR 2009, ICAD 2009. Lecture Notes in Computer Science, vol 5954. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12439-6_5

  • DOI: https://doi.org/10.1007/978-3-642-12439-6_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-12438-9

  • Online ISBN: 978-3-642-12439-6

  • eBook Packages: Computer Science (R0)
