Abstract
We present two computer-vision-based systems that enable multiple users to concurrently manipulate graphic objects presented on tabletop displays. The two solutions have different hardware layouts and use different algorithms for gesture analysis and recognition. The first is a media-handling application that can be used by both co-located and remote users. The second is a knowledge-building application in which users manipulate the contents of a wiki as a visual concept map. The performance of both systems is evaluated and discussed. A conceptual framework is introduced, providing fundamental guidelines for the design of natural interaction languages on tabletops.
Baraldi, S., Del Bimbo, A. & Landucci, L. Natural interaction on tabletops. Multimed Tools Appl 38, 385–405 (2008). https://doi.org/10.1007/s11042-007-0195-7