SoTCM: a scene-oriented task complexity metric for gaze-supported teleoperation tasks

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

Recent developments in human–robot interaction (HRI) research have heightened the need to incorporate indirect human signals that implicitly facilitate intuitive human-guided interactions. Eye gaze is now widely used as an input interface in multi-modal teleoperation scenarios because of its ability to reveal human intentions and forthcoming actions. To date, however, there has been no discussion of how the structure of the environment the human interacts with affects the complexity of the teleoperation task. In this paper, a new metric named the “Scene-oriented Task Complexity Metric” (SoTCM) is proposed to estimate the complexity of a scene involved in eye-gaze-supported teleoperation tasks. SoTCM objectively estimates the effort demanded of the human operator as the expected time required to point at all the informative locations retrieved from the scene under discussion. SoTCM depends on both the density and the distribution of the informative locations in the scene, while incorporating the eye-movement behavior reported in the psychology literature. SoTCM is validated subjectively using the time-to-complete index together with the standard NASA-TLX workload measure in eight scenes of varying structure. Results confirmed a significant relation between SoTCM and the measured task workload, which supports using SoTCM to predict scene complexity, and hence task workload, in advance.
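Only the abstract is available here, so the paper's exact formulation is not reproduced. As an illustration of the idea, the sketch below computes a SoTCM-like score for a scene under stated assumptions: informative locations are taken to be 2D image points, the visiting order is assumed to be nearest-neighbor, and each gaze shift is timed with a Fitts'-law-style model (Shannon formulation) using made-up coefficients `A_SEC`, `B_SEC` and an assumed uniform target width `TARGET_W`. None of these names, constants, or policy choices come from the paper.

```python
import math

# Hypothetical constants for a Fitts'-law-style gaze-pointing time model
# (illustrative only; not taken from the paper).
A_SEC = 0.05     # intercept, seconds
B_SEC = 0.10     # slope, seconds per bit
TARGET_W = 20.0  # assumed uniform target width, pixels


def gaze_shift_time(dist: float, width: float = TARGET_W) -> float:
    """Time for one gaze shift, Shannon formulation of Fitts' law."""
    return A_SEC + B_SEC * math.log2(dist / width + 1.0)


def sotcm_like_score(points, start=(0.0, 0.0)) -> float:
    """Expected time to point at every informative location in the scene,
    visiting points in nearest-neighbor order (an assumed policy)."""
    remaining = list(points)
    cur, total = start, 0.0
    while remaining:
        # Pick the closest unvisited informative location and add the
        # time of the gaze shift needed to reach it.
        nxt = min(remaining, key=lambda p: math.dist(cur, p))
        total += gaze_shift_time(max(math.dist(cur, nxt), 1.0))
        remaining.remove(nxt)
        cur = nxt
    return total


if __name__ == "__main__":
    sparse = [(100, 100), (130, 110)]                      # few, clustered points
    dense = [(50, 50), (400, 80), (200, 300), (620, 440),  # many, spread out
             (90, 410), (550, 120)]
    print(f"sparse scene: {sotcm_like_score(sparse):.2f} s")
    print(f"dense scene:  {sotcm_like_score(dense):.2f} s")
```

Under this sketch, adding more informative locations (density) or spreading them farther apart (distribution) both increase the score, mirroring the two dependencies the abstract attributes to SoTCM.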

Acknowledgements

This research was partially supported by the Civil-Military Technology Cooperation Program (15-CM-RB-09) and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2016R1E1A1A02921594). The first author was also supported by a postdoctoral fellowship from Korea University of Technology and Education (KOREATECH), which is gratefully acknowledged.

Author information

Corresponding author

Correspondence to Haitham El-Hussieny.

About this article

Cite this article

El-Hussieny, H., Assal, S.F.M. & Ryu, JH. SoTCM: a scene-oriented task complexity metric for gaze-supported teleoperation tasks. Intel Serv Robotics 11, 279–288 (2018). https://doi.org/10.1007/s11370-018-0253-1
