
Is It in Your Eyes? Explorations in Using Gaze Cues for Remote Collaboration

Chapter in: Collaboration Meets Interactive Spaces

Abstract

According to previous research, head-mounted displays (HMDs) and head-worn cameras (HWCs) are useful for remote collaboration. These systems can be especially helpful for remote assistance on physical tasks, where a remote expert can see the workspace of the local user and provide feedback. However, an HWC often has a wide field of view, so it can be difficult for the remote expert to know exactly where the local user is looking. In this chapter we explore how head-mounted eye-tracking can be used to convey gaze cues to a remote collaborator. We describe two prototypes that integrate an eye-tracker with an HWC and a see-through HMD, and report results from user studies conducted with these systems. Overall, we found that showing gaze cues on a shared video appears to be better than providing the video on its own, and that combining gaze and pointing cues was the most effective interface for remote collaboration among the conditions tested. We also discuss the limitations of this work and present directions for future research.
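
To make the core idea concrete, the following is a minimal, illustrative sketch (not the chapter's actual implementation) of how a gaze cue from a head-mounted eye tracker might be overlaid on a head-worn camera's video before it is shared with a remote expert. It assumes Python with OpenCV, a webcam standing in for the HWC, and a placeholder normalized gaze position; a real system would update the gaze point from the tracker and stream the annotated frames to the remote side.

# Illustrative sketch only: overlay a gaze cursor on a shared video frame.
# Assumes gaze arrives as normalized (x, y) coordinates in [0, 1] from a
# head-mounted eye tracker; here the gaze position is a fixed placeholder.
import cv2


def draw_gaze_cue(frame, gaze_norm, color=(0, 255, 0)):
    """Draw a circular gaze cursor at a normalized gaze position."""
    h, w = frame.shape[:2]
    x, y = int(gaze_norm[0] * w), int(gaze_norm[1] * h)
    cv2.circle(frame, (x, y), 20, color, 3)
    return frame


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)      # webcam standing in for the head-worn camera
    gaze_norm = (0.5, 0.5)         # placeholder; a real tracker updates this per frame
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = draw_gaze_cue(frame, gaze_norm)
        cv2.imshow("Shared view with gaze cue", frame)  # would be streamed remotely
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

A pointing cue from the remote expert could be drawn in the same way (for example, a second marker at the expert's mouse position); combining gaze and pointing cues in this manner is the configuration the studies found most effective.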



Author information

Correspondence to Mark Billinghurst.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Billinghurst, M. et al. (2016). Is It in Your Eyes? Explorations in Using Gaze Cues for Remote Collaboration. In: Anslow, C., Campos, P., Jorge, J. (eds) Collaboration Meets Interactive Spaces. Springer, Cham. https://doi.org/10.1007/978-3-319-45853-3_9

  • DOI: https://doi.org/10.1007/978-3-319-45853-3_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-45852-6

  • Online ISBN: 978-3-319-45853-3

  • eBook Packages: Computer Science, Computer Science (R0)
