
Human Error Prediction Using Eye Tracking to Improvise Team Cohesion in Human-Machine Teams

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 778)

Abstract

The rapid integration of intelligent systems into every corner of technology has created an emerging field called Human-Machine Teaming (HMT). In HMT, humans and machines collaborate with one another to accomplish a common goal or task. To achieve the best team performance, it is necessary to build trust and cohesion among all teammates, machines and humans alike. Furthermore, it is well established that a team member's ability to predict a fellow member's future course of action, and to maintain an accurate picture of the team's state, is a valuable asset that leads to better team dynamics and team performance. To realize such an ability, we propose a human error prediction methodology that gives an intelligent system an advance understanding of human actions. In the proposed method, we use eye-tracking metrics such as gaze density and cognitive state to predict human errors. The results obtained with the proposed methods show that they are effective in predicting human error probability.
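The abstract names gaze density and a cognitive-state measure as predictive inputs, and the keywords list Support Vector Machines and decision trees as classifiers. The following minimal Python sketch, which is not the authors' implementation, illustrates how hypothetical per-trial eye-tracking features could feed those two classifier types to estimate a human error probability; the feature names, synthetic data, and example values are assumptions made purely for illustration.

# Minimal sketch (not the paper's code): train the two classifier types named in
# the keywords (SVM, decision tree) on hypothetical eye-tracking features to
# predict human error probability. Features, data, and values are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-trial features: gaze density over the area of interest,
# mean fixation duration (ms), and pupil diameter (mm) as a cognitive-load proxy.
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),      # gaze_density
    rng.normal(250.0, 60.0, n),    # fixation_duration_ms
    rng.normal(3.5, 0.5, n),       # pupil_diameter_mm
])

# Synthetic label: errors become more likely when gaze density is low and
# pupil diameter (cognitive load) is high. This stands in for observed errors.
p_error = 1.0 / (1.0 + np.exp(-(2.0 * (X[:, 2] - 3.5) - 3.0 * (X[:, 0] - 0.5))))
y = (rng.uniform(size=n) < p_error).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
tree = DecisionTreeClassifier(max_depth=4, random_state=0)

for name, model in [("SVM", svm), ("Decision tree", tree)]:
    model.fit(X_tr, y_tr)
    acc = model.score(X_te, y_te)
    # Predicted error probability for one new observation (illustrative values).
    prob = model.predict_proba([[0.3, 280.0, 4.1]])[0, 1]
    print(f"{name}: accuracy={acc:.2f}, predicted error probability={prob:.2f}")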

Keywords

Cohesion · Decision trees · Eye tracking · Human-Machine Teaming · Support Vector Machine

Notes

Acknowledgments

This research project is funded by The University of Toledo and a Round 1 Award from the Ohio Federal Research Jobs Commission (OFMJC) through the Ohio Federal Research Network (OFRN). The authors also appreciate the support of the Paul A. Hotmer Family CSTAR (Cybersecurity and Teaming Research) Lab and the EECS (Electrical Engineering and Computer Science) Department at The University of Toledo.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

1. EECS Department, College of Engineering, The University of Toledo, Toledo, USA
