
bRIGHT – Workstations of the Future and Leveraging Contextual Models

  • Rukman Senanayake
  • Grit Denker
  • Patrick Lincoln
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10904)

Abstract

Experimenting with futuristic computer workstation designs and specifically tailored application models can yield useful insights and result in exciting ways to increase efficiency, effectiveness, and satisfaction for computer users. Designing and building a computer workstation that can track a user's gaze, sense proximity to the touch surface, and support multi-touch and face recognition meant overcoming some unique technological challenges. Coupled with extensions to commonly used applications that report user interactions in a meaningful way, the workstation will allow the development of a rich contextual user model that is accurate enough to enable benefits such as contextual filtering, task automation, contextual auto-fill, and improved understanding of team collaboration. SRI's bRIGHT workstation was designed and built to explore these research avenues, investigate how such a context model can be built, identify the key implications for designing an application model that best serves these goals, and discover other related factors. This paper also proposes future research toward a collaborative context model that could extend similar benefits to groups of users.
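
To make the contextual-model idea concrete, the following is a minimal sketch, assuming an event stream in which instrumented applications report the UI element acted on and a gaze tracker reports the fixated element. All names here (InteractionEvent, ContextModel, contextual_filter) are hypothetical illustrations, not the bRIGHT implementation.

    # Illustrative sketch (not the bRIGHT implementation) of fusing
    # application-reported interactions with gaze data into a context model
    # that supports simple contextual filtering. All names are hypothetical.
    from dataclasses import dataclass, field
    from time import time
    from typing import List, Optional, Set

    @dataclass
    class InteractionEvent:
        """One fused observation: what the user attended to and acted on."""
        timestamp: float
        application: str            # reported by the instrumented application
        ui_element: str             # semantic label of the widget acted on
        gaze_target: Optional[str]  # element fixated, per the gaze tracker
        touched: bool               # whether a touch accompanied the fixation

    @dataclass
    class ContextModel:
        """Accumulates events and exposes a crude notion of current context."""
        events: List[InteractionEvent] = field(default_factory=list)

        def record(self, event: InteractionEvent) -> None:
            self.events.append(event)

        def recent_focus(self, window_s: float = 60.0) -> Set[str]:
            """UI elements looked at or touched within the last time window."""
            cutoff = time() - window_s
            return {e.ui_element for e in self.events
                    if e.timestamp >= cutoff and (e.touched or e.gaze_target)}

        def contextual_filter(self, items: List[str]) -> List[str]:
            """Rank candidate items; those matching recent focus come first."""
            focus = self.recent_focus()
            return sorted(items, key=lambda item: item not in focus)

    # Example: a fixation plus touch on an "alert_table" element promotes
    # related candidates in a later suggestion list.
    model = ContextModel()
    model.record(InteractionEvent(time(), "analyst_console", "alert_table",
                                  gaze_target="alert_table", touched=True))
    print(model.contextual_filter(["file_browser", "alert_table", "email"]))

In such a design, the semantic labels reported by the applications, rather than raw screen coordinates, are what make the interaction stream meaningful enough to drive contextual filtering, auto-fill, and task automation.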

Keywords

Contextual model · Cognitive model · Task automation · Multimodal input · Gaze tracking


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Rukman Senanayake (1)
  • Grit Denker (1)
  • Patrick Lincoln (1)

  1. SRI International, Menlo Park, USA
