Are Teammate Trust and Confidence Dissociable in Risk Intensive Human Machine Teaming?

  • John G. Blitch
  • Anna D. Skinner
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 876)


Although automation has become the focus of an increasingly expansive body of research literature on human-machine team development, few studies have examined the distinction between confidence in a teammate's capabilities and trust in their intentions. Fewer still have examined the relationship between these two important components of reliance under naturalistic conditions of high risk. We launch into this void with an initial examination of historical case studies which suggest that risk can act as a catalyst that surprisingly and profoundly transforms the relationship between confidence and trust: from a typically convergent, positive influence on teammate reliance to a divergent one that can substantially diminish it. We further examine these historical events as rare yet profound occurrences that take place outside the university and hospital laboratory environments in which the vast majority of scientific conclusions are drawn from the behavior of young college students with immature frontal lobes performing artificial tasks under emotionally sterile, risk-averse conditions. We close with the ambitious goal of inspiring a shift within the Human Machine Interaction and Cognitive Engineering fields toward naturalistic, risk-intensive research with increased ecological validity for the military and first responder communities.


Keywords: Human Machine Teaming · Human Robot Interaction · Trust · Automation · Robotics



This work was sponsored by the Warfighter Interface Division of the 711th Human Performance Wing at the Air Force Research Laboratory. The authors would like to thank Dr. Robert S. Gutzwiller for his sage perspective and advice regarding controversial aspects of human-machine teaming and trust.



Copyright information

© This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply 2019

Authors and Affiliations

  1. US Air Force Academy, El Paso County, USA
