Software Support for Teaching and Measuring Cognitive Readiness



Cognitive readiness to perform complex decision-making or problem-solving tasks can be, and often is, taught and measured in interactive practice environments, including computer games and simulations. Traditionally, human experts have observed learners in such environments and assessed their cognitive readiness for the task by evaluating their performance. These experts have sometimes also instructed the learners, either in real time or in after-action reviews, which sometimes included playback of a recorded practice session. Progress has been made toward automating both assessment and instruction in simulations and games. The TAO Sandbox, a practice environment for planning surface warfare naval tactics, includes features that support automated assessment. Elements of that system suggest universal services that games and simulations could offer to support automated instruction and assessment of cognitive readiness.


Keywords: Mental Simulation; Emergent Event; Threat Level; Surface Ship; Active Sonar



The work reported herein was partially supported by a grant from the Office of Naval Research, Award Number N00014-08-1-0126. The findings and opinions expressed here do not necessarily reflect the positions or policies of the Office of Naval Research.



Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Center for Cognitive Technology, Rossier School of Education, University of Southern California, Redondo Beach, USA
