Abstract
Human error is frequently judged to be a primary contributor to high-consequence accidents in complex systems. This chapter explores this issue and argues that total elimination of human error is a futile pursuit. Instead, systems should be designed so that they are error tolerant in the sense that errors can occur without leading to unacceptable consequences. The idea of error tolerance is described in terms of its empirical basis and an evolving conceptual architecture for error tolerant interfaces.
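The abstract's central idea, an interface that lets errors occur without unacceptable consequences, can be illustrated with a minimal sketch. This is an illustrative assumption, not the chapter's actual architecture: the names (`ErrorTolerantInterface`, `Action`, `submit`) and the simple intent check are hypothetical stand-ins for the intent-inferencing and error-classification mechanisms the chapter discusses.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """An operator action; irreversible actions deserve extra scrutiny."""
    name: str
    reversible: bool = True

class ErrorTolerantInterface:
    """Hypothetical error-tolerant command filter.

    Rather than executing every operator action directly, the interface
    checks each action against the currently expected set (a crude proxy
    for inferred intent) and holds anomalous or irreversible actions for
    confirmation instead of letting a possible error propagate.
    """

    def __init__(self, expected_actions):
        self.expected = set(expected_actions)
        self.log = []

    def submit(self, action: Action) -> str:
        if action.name in self.expected and action.reversible:
            self.log.append(("executed", action.name))
            return "executed"
        # Tolerate the possible error: defer, don't destroy.
        self.log.append(("held", action.name))
        return "held for confirmation"

ui = ErrorTolerantInterface(expected_actions={"open_valve", "read_gauge"})
print(ui.submit(Action("read_gauge")))                          # executed
print(ui.submit(Action("shutdown_plant", reversible=False)))    # held for confirmation
```

The design choice here mirrors the abstract's argument: the interface does not try to prevent the operator from ever erring, it arranges that an unexpected action is caught and reviewed before it has unacceptable consequences.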
Suggestions for Further Reading
Rasmussen, J., Duncan, K., & Leplat, J. (Eds.). (1987). New technology and human error. Chichester, UK: Wiley.
Reason, J. (1989). Causes of human error. Cambridge, UK: Cambridge University Press.
Reason, J., & Mycielska, K. (1982). Absent-minded? The psychology of mental lapses and everyday errors. Englewood Cliffs, NJ: Prentice-Hall.
Rouse, W.B., & Morris, N.M. (1987). Conceptual design of a human error tolerant interface for complex engineering systems. Automatica, 23, 231–235.
Rouse, W.B., & Rouse, S.H. (1983). Analysis and classification of human error. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 539–549.
Copyright information
© 1990 Van Nostrand Reinhold
Cite this chapter
Rouse, W.B. (1990). Designing for human error: Concepts for error tolerant systems. In H.R. Booher (Ed.), Manprint. Dordrecht: Springer. https://doi.org/10.1007/978-94-009-0437-8_8
Print ISBN: 978-94-010-6680-8
Online ISBN: 978-94-009-0437-8