Errors: An Inherent Part of Human-System Performance

  • Frank E. Ritter
  • Gordon D. Baxter
  • Elizabeth F. Churchill


In this chapter we consider how errors contribute to accidents, large and small, and what we can do about them. We discuss the problems of post-hoc analyses, the types of human error that can occur, and how to design systems so that errors can be appropriately managed. The examples illustrate how users’ characteristics, in terms of psycho-physiology, fatigue, cognitive processing, and social situation, can all contribute to failures. We particularly note the importance of Norman’s (and others’) central guideline: the need to design for error.
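To make the design-for-error guideline concrete, here is an illustrative sketch (ours, not from the chapter): a small key-value store whose destructive operations can be undone, so that a user's slip is recoverable rather than final. All names here are hypothetical, chosen only to show the principle of supporting error recovery instead of assuming users never err.

```python
class UndoableStore:
    """A tiny key-value store whose destructive operations can be undone.

    Illustrative only: Norman's "design for error" guideline suggests
    systems should expect slips and make them recoverable, e.g. by
    supporting undo rather than relying on perfect user performance.
    """

    _MISSING = object()  # sentinel: key did not exist before the change

    def __init__(self):
        self._data = {}
        self._history = []  # stack of (key, previous value or _MISSING)

    def set(self, key, value):
        # Record the prior state before overwriting, so it can be restored.
        self._history.append((key, self._data.get(key, self._MISSING)))
        self._data[key] = value

    def delete(self, key):
        # A destructive action is made reversible by saving the old value.
        if key in self._data:
            self._history.append((key, self._data[key]))
            del self._data[key]

    def undo(self):
        """Reverse the most recent change, supporting error recovery."""
        if not self._history:
            return
        key, previous = self._history.pop()
        if previous is self._MISSING:
            del self._data[key]
        else:
            self._data[key] = previous

    def get(self, key, default=None):
        return self._data.get(key, default)
```

A user who deletes a value by mistake can call `undo()` to restore it; the design choice is that the system, not the user, carries the burden of remembering enough state to recover from the error.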


Keywords: Human Performance · Human Error · Fault Tree · Error Mode · Flight Crew (machine-generated, not supplied by the authors)


References

  1. Air Accidents Investigation Branch. (1989). Report on the accident to Boeing 737-400 G-OBME near Kegworth, Leicestershire. Retrieved 8 March 2014.
  2. Arnstein, F. (1997). Catalogue of human error. British Journal of Anaesthesia, 79, 645–656.
  3. Bainbridge, L. (1987). Ironies of automation. In J. Rasmussen, K. Duncan, & J. Leplat (Eds.), New technology and human error (pp. 271–283). Chichester, UK: John Wiley.
  4. Baxter, G. D. (2000). State misinterpretation in flight crew behaviour: An incident-based analysis. Unpublished PhD thesis, University of Nottingham.
  5. Besnard, D., Greathead, D., & Baxter, G. (2004). When mental models go wrong: Co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117–128.
  6. Bogner, M. S. (Ed.). (2004). Misadventures in health care. Mahwah, NJ: Erlbaum.
  7. Chappell, S. L. (1994). Using voluntary incident reports for human factors evaluations. In N. Johnston, N. McDonald, & R. Fuller (Eds.), Aviation psychology in practice (pp. 149–169). Aldershot, UK: Avebury.
  8. Dekker, S. (2005). Ten questions about human error: A new view of human factors and system safety. Mahwah, NJ: Erlbaum.
  9. Dekker, S. (2007). Just culture: Balancing safety and accountability. Aldershot, UK: Ashgate.
  10. Dismukes, K., Young, G., & Sumwalt, R. (1998). Cockpit interruptions and distractions: Effective management requires a careful balancing act. ASRS Directline, 10, 4–9.
  11. Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data (2nd ed.). Cambridge, MA: MIT Press.
  12. Fitts, P. M. (1951). Human engineering for an effective air navigation and traffic control system. Washington, DC: National Research Council.
  13. Gigerenzer, G. (2004). Dread risk, September 11, and fatal traffic accidents. Psychological Science, 15(4), 286–287.
  14. Hollnagel, E. (1993a). Human reliability analysis: Context and control. London, UK: Academic Press.
  15. Hollnagel, E. (1993b). The phenotypes of erroneous actions. International Journal of Man-Machine Studies, 39, 1–32.
  16. Hollnagel, E. (1998). Cognitive reliability and error assessment method. Oxford, UK: Elsevier Science.
  17. Hollnagel, E., Drøivoldsmo, A., & Kirwan, B. (1996). Practical insights from studies of operator diagnosis. In Proceedings of ECCE-8: Eighth European Conference on Cognitive Ergonomics (pp. 133–137), Granada, 8–12 September 1996. Rocquencourt, France: European Association of Cognitive Ergonomics.
  18. Hollnagel, E., Woods, D. D., & Leveson, N. (Eds.). (2006). Resilience engineering: Concepts and precepts. Aldershot, UK: Ashgate.
  19. Hutchins, E. (1995). Cognition in the wild. Cambridge, MA: MIT Press.
  20. Johnson, C. W., & Holloway, C. M. (2007). A longitudinal analysis of the causal factors in major maritime accidents in the USA and Canada (1996–2006). In The safety of systems: Proceedings of the 15th Safety-Critical Systems Symposium (pp. 85–104). London, UK: Springer.
  21. Kemeny, J. G. (Chairman). (1979). The need for change: The legacy of TMI. Washington, DC: The President’s Commission on the Accident at Three Mile Island.
  22. Mach, E. (1905). Knowledge and error (English trans., 1976). Dordrecht, Netherlands: Reidel.
  23. Miguel, A., & Wright, P. (2003). CHLOE: A technique for analysing collaborative systems. In Proceedings of the 9th Conference on Cognitive Science Approaches to Process Control (pp. 53–60). New York, NY: ACM Press.
  24. Nardi, B. A. (1996). Context and consciousness: Activity theory and human-computer interaction. Cambridge, MA: MIT Press.
  25. Norman, D. A. (1981). Categorization of action slips. Psychological Review, 88, 1–15.
  26. Norman, D. A. (1988). The psychology of everyday things. New York, NY: Basic Books.
  27. Norman, D. A. (2013). The design of everyday things. New York, NY: Basic Books.
  28. Petroski, H. (1985/1992). To engineer is human: The role of failure in successful design. New York, NY: Vintage Books.
  29. Petroski, H. (1994). Design paradigms: Case histories of error and judgment in engineering. Cambridge, UK: Cambridge University Press.
  30. Petroski, H. (2006). Success through failure: The paradox of design. Princeton, NJ: Princeton University Press.
  31. Pew, R. W., Miller, D. C., & Feeher, C. E. (1981). Evaluation of proposed control room improvements through analysis of critical operator decisions (EPRI-NP-1982). Cambridge, MA: Bolt, Beranek & Newman.
  32. Pocock, S., Harrison, M., Wright, P., & Johnson, P. (2001). THEA: A technique for human error assessment early in design. In Proceedings of Human-Computer Interaction: INTERACT’01 (pp. 247–254). Amsterdam, The Netherlands: IOS Press.
  33. Randell, B. (2000). Facing up to faults. The Computer Journal, 43, 95–106.
  34. Rasmussen, J. (1976). Outlines of a hybrid model of the process operator. In T. G. Sheridan & G. Johannsen (Eds.), Monitoring behavior and supervisory control (pp. 371–383). New York, NY: Plenum.
  35. Rasmussen, J. (1980). What can be learned from human error reports? In K. Duncan, M. Gruneberg, & D. Wallis (Eds.), Changes in working life (pp. 97–113). Chichester, UK: Wiley.
  36. Rasmussen, J. (1988). Human error mechanisms in complex work environments. Reliability Engineering and System Safety, 22, 155–167.
  37. Rasmussen, J., Pejtersen, A.-M., & Goodstein, L. P. (1994). Cognitive systems engineering. Chichester, UK: Wiley.
  38. Reason, J. (1990). Human error. Cambridge, UK: Cambridge University Press.
  39. Senders, J. W., & Moray, N. P. (1991). Human error: Cause, prediction, and reduction. Hillsdale, NJ: Erlbaum.
  40. Swain, A. D., & Guttman, H. E. (1983). A handbook of human reliability analysis with emphasis on nuclear power applications. Washington, DC: US Nuclear Regulatory Commission.
  41. van der Schaaf, T. W. (1991). A framework for designing near miss management systems. In T. W. van der Schaaf, D. A. Lucas, & A. Hale (Eds.), Near miss reporting as a safety tool (pp. 27–35). Oxford, UK: Butterworth-Heinemann.
  42. Vaughan, D. (1997). The Challenger launch decision. Chicago, IL: University of Chicago Press.
  43. Wagenaar, W. A., & Groeneweg, J. (1987). Accidents at sea: Multiple causes and impossible consequences. International Journal of Man-Machine Studies, 27, 587–598.
  44. Wickens, C. D., & Hollands, J. G. (2000). Engineering psychology and human performance (3rd ed.). Upper Saddle River, NJ: Prentice-Hall.
  45. Wiener, E., Kanki, B., & Helmreich, R. L. (Eds.). (1993). Cockpit resource management. London, UK: Academic Press.
  46. Woods, D. D. (1984). Some results on operator performance in emergency events. In Institution of Chemical Engineers Symposium Series (Vol. 90, pp. 21–31).
  47. Woods, D. D., Johannesen, L. J., Cook, R. I., & Sarter, N. B. (1994). Behind human error: Cognitive systems, computers, and hindsight. Wright-Patterson Air Force Base, OH: CSERIAC.

Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  • Frank E. Ritter (1)
  • Gordon D. Baxter (2)
  • Elizabeth F. Churchill (3)

  1. College of IST, The Pennsylvania State University, University Park, USA
  2. School of Computer Science, University of St Andrews, St Andrews, UK
  3. eBay Research Labs, San Jose, USA
