
Towards a Framework for Whole Body Interaction with Geospatial Data

  • Florian Daiber
  • Johannes Schöning
  • Antonio Krüger
Chapter
Part of the Human-Computer Interaction Series (HCIS)

Abstract

For about 6,000 years, humans have used maps to navigate through space and to solve other spatial tasks. For most of that time, maps were drawn or printed on a piece of paper (or on material such as stone or papyrus) of a certain size. Nowadays maps can be displayed on a wide range of electronic devices, from small-screen mobile devices to highly interactive large multi-touch screens. Thanks to widely available computing power, Geographic Information Systems (GIS) allow a rich set of operations on spatial data. However, most GIS require a high degree of expertise from their users, making them difficult for laypersons to operate. In this work we discuss the possibilities of navigating maps using physical (whole-body) gestures to easily perform typical basic spatial tasks within GIS (e.g. pan, zoom and selection operations). We studied multi-modal interaction with large- and mid-scale displays using multi-touch, foot and gaze input. We are interested in understanding how non-expert users interact with such multi-touch surfaces. Therefore, we provide a categorization and a framework of multi-touch hand gestures for interacting with GIS. Combining these multi-touch gestures with a small set of foot gestures for solving geospatial tasks leads to an extended framework for multi-touch and foot input. In a further step this framework is extended again with eye-gaze input.
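To make the basic gesture-to-operation mapping mentioned in the abstract concrete (a one-finger drag for panning, a two-finger pinch for zooming), the following is a minimal, self-contained Python sketch. It is not the chapter's implementation: the MapView and TouchGestureMapper names and the frame-based touch-input format are illustrative assumptions.

```python
import math
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class MapView:
    """Minimal map viewport: a center in map coordinates plus a zoom factor.

    Illustrative sketch only; not the chapter's implementation.
    """
    center_x: float = 0.0
    center_y: float = 0.0
    zoom: float = 1.0


class TouchGestureMapper:
    """Maps raw touch frames to pan and zoom operations on a MapView.

    One finger dragging -> pan; two fingers changing their distance
    -> zoom (pinch/spread). Each frame is {touch_id: (x, y)} in pixels.
    """

    def __init__(self, view: MapView):
        self.view = view
        self._last: Dict[int, Tuple[float, float]] = {}

    def update(self, touches: Dict[int, Tuple[float, float]]) -> None:
        # Only fingers present in both the previous and the current frame
        # contribute to a gesture.
        common = set(touches) & set(self._last)
        if len(common) == 1:
            # Pan: shift the map center opposite to the finger motion,
            # scaled by the zoom so panning feels constant on screen.
            tid = next(iter(common))
            dx = touches[tid][0] - self._last[tid][0]
            dy = touches[tid][1] - self._last[tid][1]
            self.view.center_x -= dx / self.view.zoom
            self.view.center_y -= dy / self.view.zoom
        elif len(common) >= 2:
            # Zoom: scale by the ratio of current to previous finger distance.
            a, b = sorted(common)[:2]
            d_now = math.dist(touches[a], touches[b])
            d_prev = math.dist(self._last[a], self._last[b])
            if d_prev > 0:
                self.view.zoom *= d_now / d_prev
        self._last = dict(touches)


if __name__ == "__main__":
    view = MapView()
    mapper = TouchGestureMapper(view)
    mapper.update({1: (100, 100)})                  # finger down
    mapper.update({1: (120, 90)})                   # drag -> pan
    mapper.update({1: (120, 90), 2: (220, 90)})     # second finger down
    mapper.update({1: (100, 90), 2: (240, 90)})     # spread -> zoom in
    print(view)
```

The same structure extends naturally to the multi-modal case discussed in the chapter: additional input channels (foot or gaze events) would feed further operations, or modify how the touch gestures are interpreted, without changing the frame-based mapping shown here.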

Keywords

Geographic Information System, Hand Gesture, Geospatial Data, Interaction Space, Frustrated Total Internal Reflection


Copyright information

© Springer-Verlag London Limited 2011

Authors and Affiliations

  • Florian Daiber (1)
  • Johannes Schöning (1)
  • Antonio Krüger (1)

  1. Innovative Retail Lab, German Research Institute for Artificial Intelligence (DFKI), Saarbrücken, Germany