
Parts and Wholes: Scenarios and Simulators for Human Performance Studies

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 778)

Abstract

As tools like full-scale simulators and microworlds become more readily available to researchers, a fundamental question remains: to what extent are full scenarios and simulators necessary for valid and generalizable results? In this paper, we explore the continuum of scenarios and simulators and evaluate the advantages and disadvantages of each for human performance studies. The types of scenarios presented to participants may range from microtasks to complex multi-step scenarios. Microtasks usually involve only brief exposure to the human-system interface but thereby facilitate ready data collection through repeated trials. In contrast, full scenarios present a sequence of actions that may unfold over an extended period of time. The tradeoffs center on the fidelity of the situations and the requirements for the type of human performance data to be collected. The type of simulator presented to participants may range from a part-task simulator to a simplified microworld to a full-scope, high-fidelity simulator. Simplified simulators afford greater experimental control but lose much of the real-world context found in full-scope simulators. We frame scenarios and simulators in the context of micro- vs. macro-cognition and provide examples of how the different experimental design choices lend themselves to different types of studies.
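
To make the microtask tradeoff concrete, the sketch below is our own illustration rather than material from the paper; all function names, stimuli, and parameters are hypothetical. It shows how a brief, repeatable task can yield many performance observations per participant in a single session, which is the "ready data collection through repeated trials" advantage the abstract describes.

```python
# Illustrative sketch (hypothetical, not from the paper): a microtask
# session collects one response-time/accuracy observation per trial,
# so dozens of data points accrue in minutes.

import random
import time


def run_microtask_trial(stimulus: str, expected: str) -> dict:
    """Present one brief stimulus and record the (simulated) response."""
    start = time.perf_counter()
    # In a real study this would await the participant's input;
    # here we simulate a mostly correct response so the example runs.
    response = expected if random.random() < 0.9 else "other"
    rt = time.perf_counter() - start
    return {"stimulus": stimulus, "correct": response == expected, "rt_s": rt}


def run_session(n_trials: int = 50) -> list[dict]:
    """Repeated trials: many observations per participant per session."""
    stimuli = [("alarm_high", "acknowledge"), ("alarm_low", "ignore")]
    return [run_microtask_trial(*random.choice(stimuli)) for _ in range(n_trials)]


if __name__ == "__main__":
    data = run_session()
    accuracy = sum(t["correct"] for t in data) / len(data)
    print(f"{len(data)} trials collected, accuracy = {accuracy:.0%}")
```

Even this toy loop makes the contrast visible: fifty microtask trials can be administered and scored quickly, whereas a full scenario yields far richer real-world context but only a handful of analyzable events over an extended session.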

Keywords

Microworld · Full-scope simulator · Microtask · Scenario · Operator-in-the-loop study

Notes

Disclaimer

The opinions expressed in this paper are entirely those of the authors and do not represent an official position. This work of authorship was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government, nor any agency thereof, nor any of their employees makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Idaho National Laboratory is a multi-program laboratory operated by Battelle Energy Alliance, LLC, for the United States Department of Energy under Contract DE-AC07-05ID14517. This research was funded through the Laboratory Directed Research and Development program at Idaho National Laboratory.

Copyright information

© Springer International Publishing AG, part of Springer Nature (outside the USA) 2019

Authors and Affiliations

  1. Idaho National Laboratory, Idaho Falls, USA
  2. University of Idaho, Moscow, USA
  3. NTNU Social Research, Trondheim, Norway
