Abstract
One trend in Human-Computer Interaction is to extend the sensory-motor capabilities of computer systems so that they better match the natural communication means of humans. Although the multiplicity of modalities opens a vast world of experience, our understanding of how modalities relate to one another is still incomplete and the terminology remains unstable. In this paper we present our definitions, along with existing frameworks useful for the design of multimodal interaction.
Copyright information
© 2004 Springer Science + Business Media, Inc.
Cite this paper
Nigay, L. (2004). Design Space for Multimodal Interaction. In: Jacquart, R. (eds) Building the Information Society. IFIP International Federation for Information Processing, vol 156. Springer, Boston, MA. https://doi.org/10.1007/978-1-4020-8157-6_32
Print ISBN: 978-1-4020-8156-9
Online ISBN: 978-1-4020-8157-6