An Optically Based Direct Manipulation Interface for Human-Computer Interaction in an Augmented World
Augmented reality (AR) offers a powerful three-dimensional user interface for many "hands-on" application scenarios in which users cannot sit at a conventional desktop computer. To fully exploit the AR paradigm, the computer must not only augment the real world; it must also accept feedback from it. Such feedback is typically collected via gesture languages, 3D pointers, or speech input, all tools which expect users to communicate with the computer about their work at a meta-level rather than simply letting them pursue their task. When the computer can deduce progress directly from changes in the real world, the need for such abstract communication interfaces is reduced or even eliminated. In this paper, we present an optical approach for analyzing and tracking users and the objects they work with. In contrast to emerging workbench and metaDESK approaches, our system can be set up in any room after quickly placing a few known optical targets in the scene. We present three demonstration scenarios to illustrate the overall concept and potential of our approach, and then discuss the research issues involved.
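The abstract does not spell out the tracking algorithm, but a common building block when a few known optical targets lie on a planar surface is to recover the plane-to-image homography from the targets' known positions and their detected pixel coordinates. The sketch below (a minimal direct linear transform, with all coordinates hypothetical) illustrates the idea, not the paper's actual implementation:

```python
import numpy as np

def estimate_homography(world_pts, image_pts):
    """Estimate the 3x3 homography mapping planar world points
    (known optical-target positions) to image points via the
    direct linear transform (DLT)."""
    A = []
    for (X, Y), (x, y) in zip(world_pts, image_pts):
        A.append([-X, -Y, -1, 0, 0, 0, x * X, x * Y, x])
        A.append([0, 0, 0, -X, -Y, -1, y * X, y * Y, y])
    # The homography is the right singular vector associated with
    # the smallest singular value of the stacked constraint matrix.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Four known targets on a wall (world plane coordinates) and the
# pixel coordinates at which the camera observes them (made up here).
world = [(0, 0), (1, 0), (1, 1), (0, 1)]
image = [(100, 100), (300, 110), (290, 310), (95, 305)]
H = estimate_homography(world, image)

# Map a new world point into the image with the recovered homography.
p = H @ np.array([0.5, 0.5, 1.0])
print(p[:2] / p[2])
```

Four correspondences determine the homography exactly; with more targets the same least-squares formulation averages out detection noise, which is one reason scattering "a few known optical targets" around a room suffices for registration.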
Keywords: Augmented Reality, Real Object, Virtual Object, Foreground Object, Mobile Object
- 1. D. Curtis, D. Mizell, P. Gruenbaum, and A. Janin. Several Devils in the Details: Making an AR App Work in the Airplane Factory. 1st International Workshop on Augmented Reality (IWAR'98), San Francisco, 1998.
- 2. S. Feiner, B. MacIntyre, M. Haupt, and E. Solomon. Windows on the world: 2D windows for 3D augmented reality. UIST'93, pages 145–155, Atlanta, GA, 1993.
- 4. T. Kanade, A. Yoshida, K. Oda, H. Kano, and M. Tanaka. A stereo machine for video-rate dense depth mapping and its new applications. 15th IEEE Computer Vision and Pattern Recognition Conference (CVPR), 1996.
- 5. G. Klinker, D. Stricker, and D. Reiners. Augmented reality for exterior construction applications. In W. Barfield and T. Caudell, eds., Augmented Reality and Wearable Computers. Lawrence Erlbaum Press, 1999.
- 6. G. Klinker, D. Stricker, and D. Reiners. Augmented Reality: A Balance Act between High Quality and Real-Time Constraints. 1st International Symposium on Mixed Reality (ISMR'99), Y. Ohta and H. Tamura, eds., "Mixed Reality: Merging Real and Virtual Worlds", March 9–11, 1999.
- 7. P. Maes, T. Darrell, B. Blumberg, and A. Pentland. The ALIVE system: Full-body interaction with autonomous agents. Computer Animation '95, 1995.
- 8. D. Reiners, D. Stricker, G. Klinker, and S. Müller. Augmented Reality for Construction Tasks: Doorlock Assembly. 1st International Workshop on Augmented Reality (IWAR'98), San Francisco, 1998.
- 9. E. Rose, D. Breen, K.H. Ahlers, C. Crampton, M. Tuceryan, R. Whitaker, and D. Greer. Annotating real-world objects using augmented reality. Computer Graphics: Developments in Virtual Environments. Academic Press, 1995.
- 10. D. Stricker, G. Klinker, and D. Reiners. A fast and robust line-based optical tracker for augmented reality applications. 1st International Workshop on Augmented Reality (IWAR'98), San Francisco, 1998.
- 11. B. Ullmer and H. Ishii. The metaDESK: Models and prototypes for tangible user interfaces. UIST'97, pages 223–232, Banff, Alberta, Canada, 1997.
- 12. A. Webster, S. Feiner, B. MacIntyre, W. Massie, and T. Krueger. Augmented reality in architectural construction, inspection, and renovation. 3rd ASCE Congress on Computing in Civil Engineering, pages 913–919, Anaheim, CA, 1996.