Abstract
As the form factors of computational devices diversify, the concept of eyes-free interaction is becoming increasingly relevant: it is no longer hard to imagine use scenarios in which screens are inappropriate. However, there is currently little consensus about this term. It is regularly employed in different contexts and with different intents. One key consequence of this multiplicity of meanings is a lack of easily accessible insights into how to best build an eyes-free system. This paper seeks to address this issue by thoroughly reviewing the literature, proposing a concise definition and presenting a set of design principles. The application of these principles is then elaborated through a case study of the design of an eyes-free motion input system for a wearable device.
© 2007 Springer-Verlag Berlin Heidelberg
Oakley, I., Park, JS. (2007). Designing Eyes-Free Interaction. In: Oakley, I., Brewster, S. (eds) Haptic and Audio Interaction Design. HAID 2007. Lecture Notes in Computer Science, vol 4813. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76702-2_13
Print ISBN: 978-3-540-76701-5
Online ISBN: 978-3-540-76702-2