Abstract
As mobile-phone design moves toward a touch-screen form factor, visually impaired users face new accessibility challenges. The mainstream interaction model for touch-screen devices assumes that the user can see spatially arranged visual icons and interact with them through a smooth glass screen. The inherent challenge this type of interface poses for blind users is its lack of tactile feedback. In this paper we explore the concept of combining spatial audio and accelerometer technology to enable blind users to operate a touch-screen device effectively. We discuss the challenges involved in representing icons using sound, and we introduce a design framework that is helping us tease out some of these issues. We also outline a set of proposed user studies that will test the effectiveness of our design on a Nokia N97. The results of these studies will be presented at ICCHP 2010.
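The paper itself does not publish an implementation, but the core idea — tilting the device (read from the accelerometer) to sweep over an icon grid, with each icon announced by a spatially positioned sound — can be sketched as follows. This is a minimal illustrative sketch, not the authors' method: the grid size, the ±30° tilt range, and the `tilt_to_cell` / `cell_azimuth` helpers are all assumptions introduced here for clarity.

```python
def tilt_to_cell(pitch_deg, roll_deg, rows=3, cols=3, max_tilt=30.0):
    """Map device tilt (in degrees, as read from an accelerometer)
    to a cell in a rows x cols icon grid.

    Forward/back tilt (pitch) selects the row; left/right tilt (roll)
    selects the column. Tilt is clamped to +/- max_tilt degrees.
    """
    def axis_to_index(angle, n):
        t = max(-max_tilt, min(max_tilt, angle))   # clamp to usable range
        frac = (t + max_tilt) / (2.0 * max_tilt)   # normalise to 0..1
        return min(n - 1, int(frac * n))           # bucket into n cells
    return axis_to_index(pitch_deg, rows), axis_to_index(roll_deg, cols)


def cell_azimuth(col, cols=3, spread_deg=90.0):
    """Azimuth (degrees, negative = left of the listener) at which to
    render the audio cue for a column, spreading the columns evenly
    across the frontal arc so each icon is heard at a distinct position.
    """
    if cols == 1:
        return 0.0
    return -spread_deg / 2.0 + col * (spread_deg / (cols - 1))


# Example: device held level selects the centre icon, heard straight ahead.
row, col = tilt_to_cell(0.0, 0.0)
azimuth = cell_azimuth(col)
```

On an actual handset the azimuth would be handed to a 3D-audio API such as JSR-234 (AMMS) or OpenSL ES — both cited in the paper's references — to position the icon's earcon in the stereo field.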
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Neff, F., Mehigan, T.J., Pitt, I. (2010). Accelerometer & Spatial Audio Technology: Making Touch-Screen Mobile Devices Accessible. In: Miesenberger, K., Klaus, J., Zagler, W., Karshmer, A. (eds) Computers Helping People with Special Needs. ICCHP 2010. Lecture Notes in Computer Science, vol 6179. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-14097-6_28