
Simulator Studies: The Next Best Thing?

  • Erik Hollnagel
Conference paper

Abstract

The chapter describes the history of simulator studies in Human Factors research and their roots in structural psychology and Scientific Management. Following that, the establishment and development of HAMMLAB is considered relative to the events and concerns of the early 1980s. After a short discussion of the use of simulated worlds, the changing conditions for human factors research are identified. These are the change from human–computer interaction to distributed cognition, the change from first- to second-generation HRA leading to the gradual irrelevance of HRA, the change from human–machine systems to joint cognitive systems, the change from normal accidents to intractable systems, and the change from system safety to resilience engineering. The conclusion is that when the nature of work and the practical problems change, the methods and models should also change.

Keywords

Nuclear Power Plant · Human Factor · Machine System · Machine Interaction · Resilience Engineering
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer-Verlag London Limited  2010

Authors and Affiliations

  1. École des Mines de Paris, Centre for Research on Risk and Crises, Sophia Antipolis, France
