Abstract
The type of unreliability an automated system exhibits can shape a user's perception of that automation's overall operational ability. A program that makes one kind of mistake may be judged more harshly than a program that makes a different kind of error, even if both have equal success rates. Here I use a Hidden Object Game to examine how people respond to a program that appears either to miss its target objects or to raise false alarms. Playing at both high and low clutter levels, participants who believed they were working with an automated system that missed targets lost trust in that automation, and judged its performance more harshly, compared to participants who believed the automation was raising false alarms. Participants in the combined low-clutter and miss condition showed the strongest decrease in trust; when asked to estimate how often the program had been correct, this group also gave it the lowest mean score. These results demonstrate that in a target detection task, automation that misses targets is judged more harshly than automation that errs on the side of false alarms.
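To make the abstract's premise concrete, the sketch below shows how two detectors can have identical overall success rates while differing only in the kind of error they make, misses versus false alarms. The counts are invented for illustration; they are not data from the study.

```python
# Hypothetical illustration (not study data): two automated detectors
# with equal overall accuracy but different error profiles, mirroring
# the miss condition and the false-alarm condition.

def accuracy(hits, misses, false_alarms, correct_rejections):
    """Fraction of all detection decisions that were correct."""
    correct = hits + correct_rejections
    total = correct + misses + false_alarms
    return correct / total

# Miss-prone automation: errs only by overlooking real targets.
miss_prone = accuracy(hits=70, misses=20,
                      false_alarms=0, correct_rejections=110)

# False-alarm-prone automation: errs only by flagging non-targets.
fa_prone = accuracy(hits=90, misses=0,
                    false_alarms=20, correct_rejections=90)

print(miss_prone, fa_prone)  # both 0.9: equal success rates, different errors
```

Because the two error profiles yield the same accuracy, any difference in how users rate the two systems, as found here, reflects the *type* of error rather than raw performance.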
© 2019 Springer Nature Switzerland AG
Cite this paper
Kaplan, A. (2019). Trust in Imperfect Automation. In: Bagnara, S., Tartaglia, R., Albolino, S., Alexander, T., Fujita, Y. (eds) Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018). IEA 2018. Advances in Intelligent Systems and Computing, vol 824. Springer, Cham. https://doi.org/10.1007/978-3-319-96071-5_5
Print ISBN: 978-3-319-96070-8
Online ISBN: 978-3-319-96071-5
eBook Packages: Intelligent Technologies and Robotics (R0)