Abstract
An error in situational recognition may occur while driving a car, and such an error can sometimes result in ‘erroneous’ behaviour by the driver. Whether a driver assistance system can cope with such circumstances depends on the extent of authority given to the system. This paper discusses the need for machine-initiated trading of authority from the driver to the assistance system to assure driver safety. A theoretical framework is also given to describe and analyze the driver’s overtrust in, and overreliance on, such a driver assistance system.
© 2011 Springer-Verlag Italia Srl
Inagaki, T. (2011). To What Extent May Assistance Systems Correct and Prevent ‘Erroneous’ Behaviour of the Driver? In: Cacciabue, P., Hjälmdahl, M., Luedtke, A., Riccioli, C. (eds) Human Modelling in Assisted Transportation. Springer, Milano. https://doi.org/10.1007/978-88-470-1821-1_5
DOI: https://doi.org/10.1007/978-88-470-1821-1_5
Publisher Name: Springer, Milano
Print ISBN: 978-88-470-1820-4
Online ISBN: 978-88-470-1821-1
eBook Packages: Engineering (R0)