Pull and Push: Proximity-Aware User Interface for Navigating in 3D Space Using a Handheld Camera
In 3D object manipulation and virtual-space navigation tasks, an efficient zoom operation is essential. The common approach combines the mouse and keyboard, which requires users to be familiar with the controls and takes considerable time to practice. This paper presents two methods that recognize a zoom operation by sensing the user's pull and push movements. The user only needs to hold a camera in hand; when the hand is pulled back or pushed forward, our approach senses the change in proximity and translates it into a zoom operation. Through user studies, we compared the accuracy rates of the different methods and analyzed the factors that affect performance. The results show that our methods run in real time with high accuracy.
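One plausible way to sense such pull/push motion from a handheld camera (consistent with the feature-tracking literature the paper builds on) is to track feature points across frames and measure how their spread changes: points spread apart as the camera approaches the scene and contract as it moves away. The sketch below assumes point correspondences are already available (e.g. from pyramidal Lucas-Kanade tracking); the function names, threshold, and the push-means-zoom-in mapping are illustrative assumptions, not the paper's actual algorithm.

```python
import math

def scale_change(prev_pts, curr_pts):
    """Ratio of mean point spread between two frames (>1 means expansion)."""
    def spread(pts):
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        return sum(math.hypot(x - cx, y - cy) for x, y in pts) / len(pts)
    return spread(curr_pts) / spread(prev_pts)

def classify(prev_pts, curr_pts, thresh=0.05):
    """Map the spread ratio to a zoom gesture (threshold is a guess)."""
    s = scale_change(prev_pts, curr_pts)
    if s > 1 + thresh:
        return "push"   # features expand: camera moved toward the scene
    if s < 1 - thresh:
        return "pull"   # features contract: camera moved away
    return "idle"
```

A per-frame decision like this would typically be smoothed (the paper mentions a finite state machine among its keywords) before being translated into a continuous zoom command.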
Keywords: Accuracy Rate · Corner Point · Finite State Machine · Virtual Space · High Accuracy Rate