
Measures of Quality for Model Validation

  • David J. Murray-Smith
Part of the Simulation Foundations, Methods and Applications book series (SFMA)

Abstract

Issues important for model quality include the choice of model output variables and the methods and measures used to compare model and system performance. Methods of comparison may be graphical or may involve quantitative measures of a deterministic or statistical kind. Graphical approaches considered are based mainly on conventional time-history plots, but a brief introduction to box plots is also included. The possible strengths and weaknesses of graphical methods are discussed. Deterministic quantitative measures are based mainly on time-domain and frequency-domain comparisons between outputs measured from a real system and outputs predicted from a simulation model. Examples include simple squared-error measures, absolute-error measures and measures that provide convenient normalised values. Visualisation techniques are also discussed, with particular emphasis being placed on forms of polar diagram that have proved helpful in dealing with complex simulation models in several different application areas.
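The deterministic time-domain measures mentioned above (squared-error, absolute-error and normalised measures) can be illustrated with a short sketch. The snippet below is a minimal example, not taken from the chapter: it assumes measured and simulated outputs are available as equal-length sampled time series, and the function name and the choice of Theil's inequality coefficient as the normalised measure are illustrative assumptions.

```python
import numpy as np

def error_measures(y_meas, y_sim):
    """Compare a measured system output with a simulated one using
    simple deterministic quality measures (illustrative sketch)."""
    y_meas = np.asarray(y_meas, dtype=float)
    y_sim = np.asarray(y_sim, dtype=float)
    e = y_meas - y_sim                      # error sequence

    mse = np.mean(e ** 2)                   # mean squared error
    mae = np.mean(np.abs(e))                # mean absolute error

    # Theil's inequality coefficient: a normalised measure between
    # 0 (perfect agreement) and 1 (complete disagreement).
    tic = np.sqrt(np.mean(e ** 2)) / (
        np.sqrt(np.mean(y_meas ** 2)) + np.sqrt(np.mean(y_sim ** 2))
    )
    return {"MSE": mse, "MAE": mae, "TIC": tic}

# Example: a decaying oscillation and a slightly biased simulation of it
t = np.linspace(0.0, 10.0, 501)
measured = np.exp(-0.5 * t) * np.sin(2.0 * t)
simulated = 0.95 * np.exp(-0.45 * t) * np.sin(2.0 * t)
print(error_measures(measured, simulated))
```

The squared-error measure penalises large excursions more heavily than the absolute-error measure, while the normalised coefficient allows comparisons across output variables with different units or magnitudes.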

Keywords

Frequency Response Function · Polar Diagram · Control System Design · Relative Error Measure · Modal Assurance Criterion


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • David J. Murray-Smith
  1. School of Engineering, University of Glasgow, Glasgow, UK
