A Systems Approach for Evaluating a Warhead Monitoring System

  • Cliff Chen
  • Sharon DeLand
  • Thomas Edmunds


Future agreements that limit and reduce total warhead stockpiles may require monitoring nuclear warheads, potentially necessitating new monitoring approaches, technologies, and procedures. A systems approach for evaluating a warhead monitoring system can help ensure that the system is evaluated against all requirements, and that the evaluation is objective, standardized, transparent, and reproducible (not analyst dependent). All engineered systems are designed to meet a set of requirements based on inputs and assumptions about the environment and processes. For arms control monitoring, identifying requirements is challenging because treaty objectives are often strategic or political, and must be translated into technical monitoring objectives that are agreed upon by multiple stakeholders. The monitoring objectives are the foundation of the evaluation framework, which also includes evaluation scenarios that reflect strategic concerns, performance metrics based on detection goals, and a functional architecture describing how objectives are achieved by system components. This framework embodies the system requirements against which the monitoring system can be evaluated. Computational simulations, when validated by experimental activities and field exercises, can help characterize monitoring system performance. One such class of simulations is the discrete-event simulation, which can cohesively model weapons enterprise processes and monitoring and inspection activities. In conjunction with analysis algorithms that correlate inspection outcomes with declarations and other information streams, these simulation results can be used to quantify the confidence with which a monitoring system can detect and differentiate scenarios. The capability to evaluate the effectiveness of monitoring options and explore tradeoffs is essential in supporting technical design activities, guiding future R&D investments, and informing future treaty negotiations.
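The idea of quantifying detection confidence can be illustrated with a minimal Monte Carlo sketch. This is a deliberately simplified stand-in for the discrete-event simulation described above, not the actual evaluation framework: all function names and parameter values below are hypothetical. In each trial an inspector randomly samples items from a declared inventory in which some items have been diverted, and the fraction of trials in which a diverted item appears in the sample estimates the probability of detection.

```python
import random

def inspection_detects(num_items=100, num_diverted=5, sample_size=20, rng=None):
    """One simulated inspection: True if any diverted item falls in the sample.

    All parameter values are illustrative, not drawn from any real monitoring regime.
    """
    rng = rng or random.Random()
    diverted = set(rng.sample(range(num_items), num_diverted))  # hidden diversion
    sampled = set(rng.sample(range(num_items), sample_size))    # inspector's random sample
    return bool(diverted & sampled)

def detection_probability(trials=10_000, seed=42, **scenario):
    """Monte Carlo estimate of detection probability for a given scenario."""
    rng = random.Random(seed)
    return sum(inspection_detects(rng=rng, **scenario) for _ in range(trials)) / trials

# Example: with 5 of 100 items diverted and a 20-item random sample, the
# detection probability is about 0.68 (hypergeometric: 1 - C(95,20)/C(100,20)).
```

Even in this toy form, sweeping `sample_size` or `num_diverted` shows the kind of tradeoff exploration the abstract describes: designers can trade inspection effort against detection confidence across scenarios before committing to specific procedures.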



The authors acknowledge Crystal Dale for her contributions to the functional decomposition and to general conceptual discussions. The authors also acknowledge Douglas Keating, Robert Brigantic, Angela Waterworth, Casey Perkins, and Matthew Oster for their work on the development of the discrete-event simulation.



Copyright information

© 2020. This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply.

Authors and Affiliations

  1. Lawrence Livermore National Laboratory, Livermore, USA
  2. Sandia National Laboratories, Albuquerque, USA
