Emerging User Interfaces

  • Joseph L. Dvorak

Keywords

Force Feedback; Multimodal User Interface; Haptic Interface; Speech Recognizer; Interface Mechanism

Copyright information

© Springer Science+Business Media, LLC. 2008

Authors and Affiliations

  • Joseph L. Dvorak
  1. Motorola, USA
