Abstract
We present a robust detector for deictic gestures based on a time-of-flight (TOF) camera, a sensor that delivers combined range and intensity images. The pointing direction serves two purposes: it determines whether a gesture is intended for the system at all, and it assigns different meanings to the same gesture depending on where the user points. We use the gestures to control a slideshow presentation: making a “thumbs-up” gesture while pointing to the left or right of the screen switches to the previous or next slide, respectively. Pointing at the screen causes a “virtual laser pointer” to appear. Since the pointing direction is estimated in 3D, the user can move freely within the field of view of the camera once the system has been calibrated. The pointing direction is measured with an absolute accuracy of 0.6 degrees and a measurement noise of 0.9 degrees near the center of the screen.
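The “virtual laser pointer” described above amounts to intersecting the user’s 3D pointing ray with the calibrated screen plane. The following is a minimal sketch of that geometric step, not the authors’ implementation; the function name, the choice of representing the screen as a point and normal from calibration, and the example coordinates are all assumptions for illustration.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a pointing ray with the screen plane.

    origin: 3D hand position; direction: unit pointing vector;
    plane_point / plane_normal: screen plane obtained from calibration.
    Returns the 3D intersection point, or None if the ray is parallel
    to the screen or points away from it.
    """
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the screen plane
    t = np.dot(plane_point - origin, plane_normal) / denom
    if t < 0:
        return None  # screen lies behind the user
    return origin + t * direction

# Hypothetical example: user stands 2 m in front of a screen
# lying in the z = 0 plane and points straight at it.
hit = ray_plane_intersection(
    origin=np.array([0.3, 1.2, 2.0]),
    direction=np.array([0.0, 0.0, -1.0]),
    plane_point=np.array([0.0, 0.0, 0.0]),
    plane_normal=np.array([0.0, 0.0, 1.0]),
)
# hit is the 3D point where the virtual laser dot would be drawn
```

Mapping the resulting 3D point into screen pixel coordinates is then a 2D affine transform determined by the same calibration.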
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Haker, M., Böhme, M., Martinetz, T., Barth, E. (2010). Deictic Gestures with a Time-of-Flight Camera. In: Kopp, S., Wachsmuth, I. (eds) Gesture in Embodied Communication and Human-Computer Interaction. GW 2009. Lecture Notes in Computer Science(), vol 5934. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12553-9_10
DOI: https://doi.org/10.1007/978-3-642-12553-9_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-12552-2
Online ISBN: 978-3-642-12553-9
eBook Packages: Computer Science (R0)