Verifying Automation Trust in Automated Driving System from Potential Users’ Perspective

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1210)

Abstract

Trust is recognized as a key element in automation, as it relates to system safety and performance. However, models of automation trust developed in industrial automation research may not transfer directly to automated driving systems. To identify automation trust in automated driving, a semi-structured interview with potential users was conducted, based on a systematic review of the automation trust literature. Results show that factors related to the vehicle itself, specifically its reliability, had the strongest association with trust. Moreover, it is difficult to apply automation trust theory directly to human-automated vehicle interaction, because the driving task itself is dangerous and users of automated vehicles (AVs) are usually not skilled operators. This study provides a new lens for conceptualizing automation trust, which can help guide future research and design procedures that enhance driver-automation cooperation.

Keywords

Trust · Automation · Automated driving · Human-vehicle interaction

Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. College of Design and Innovation, Tongji University, Shanghai, China
