Transparent Touch – Interacting with a Multi-layered Touch-Sensitive Display System

  • Andreas Kratky
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9176)


Transparent Touch explores interaction with a display system consisting of two spatially distinct layers. A transparent touch screen is overlaid in front of a second information layer, which can be a three-dimensional object or another screen. The two layers are optically aligned and offer the advantage of providing distinct semantic contexts while integrating them optically and cognitively. The system is explored in three use-case scenarios in which the transparent screen serves as an augmentation layer, as an annotation layer, and as a control layer. The concept is familiar from head-up displays in airplanes and cars, and it integrates features of augmented-reality systems, mid-air interaction systems, and touch screens. Our study collects an initial set of user responses.


Keywords: Mid-air interaction · Augmented reality · Touch screen · Gestural interaction · Museum interface · Head-up display · Collaboration tools



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Interactive Media Division, School of Cinematic Arts, University of Southern California, Los Angeles, USA
