The Impact of Type and Level of Automation on Situation Awareness and Performance in Human-Robot Interaction

  • David Schuster
  • Florian Jentsch
  • Thomas Fincannon
  • Scott Ososky
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8019)

Abstract

In highly autonomous robotic systems, human operators can attend to their own, separate tasks rather than directly operating the robot. At the same time, when operators' attention is devoted to tasks that do not directly involve the robotic system, they may lack the situation awareness (SA) needed to recover from an automation failure or from an unexpected event. In this paper, we describe the mechanisms of this problem, known as the out-of-the-loop performance problem, and explain why it may persist in future robotic systems. We review existing solutions, which focus on the level of automation, and describe our current empirical work, which aims to expand upon taxonomies of levels of automation to better understand how engineers of robotic systems may mitigate the problem.
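
As a point of reference for the abstract (and not part of the paper itself), the taxonomy it builds on is commonly formalized along two dimensions: the type of automation, i.e., which stage of information processing is automated (Parasuraman, Sheridan, and Wickens distinguish information acquisition, information analysis, decision selection, and action implementation), and the level of automation, i.e., the degree of autonomy at that stage, often expressed on Sheridan and Verplank's scale from 1 (fully manual) to 10 (fully autonomous). The Python sketch below shows one hypothetical way to encode a robot's automation profile in these terms; all names in it are illustrative assumptions, not an implementation from the study.

    # Illustrative sketch: encoding the "types and levels" view of automation.
    # The four stages follow Parasuraman, Sheridan, and Wickens (2000); the
    # 1-10 level scale follows Sheridan and Verplank (1978). All class and
    # variable names here are hypothetical.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Dict


    class AutomationType(Enum):
        """Stage of information processing that automation supports (the type)."""
        INFORMATION_ACQUISITION = 1
        INFORMATION_ANALYSIS = 2
        DECISION_SELECTION = 3
        ACTION_IMPLEMENTATION = 4


    @dataclass(frozen=True)
    class AutomationProfile:
        """Degree of autonomy per stage: 1 = fully manual, 10 = fully autonomous."""
        levels: Dict[AutomationType, int]

        def level_of(self, stage: AutomationType) -> int:
            # Stages not listed default to the lowest level (fully manual).
            return self.levels.get(stage, 1)


    # Example: a robot that automates sensing and analysis heavily but leaves
    # decisions and actions largely to the human operator. Raising the last
    # two levels is what tends to push the operator "out of the loop".
    teleop_assist = AutomationProfile(levels={
        AutomationType.INFORMATION_ACQUISITION: 8,
        AutomationType.INFORMATION_ANALYSIS: 6,
        AutomationType.DECISION_SELECTION: 2,
        AutomationType.ACTION_IMPLEMENTATION: 3,
    })
    print(teleop_assist.level_of(AutomationType.DECISION_SELECTION))  # -> 2

In these terms, the out-of-the-loop performance problem discussed in the paper concerns what happens to operator SA as the levels at the later stages rise.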

Keywords

Human-robot interaction · Robot design · Situation awareness · Automation

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • David Schuster¹
  • Florian Jentsch¹
  • Thomas Fincannon¹
  • Scott Ososky¹

  1. Institute for Simulation and Training, University of Central Florida, Orlando, USA
