A systematic approach to software safety integrity levels

  • Peter A. Lindsay
  • John A. McDermid
Conference paper

Abstract

International Standards for safety-critical software typically use notions of Safety Integrity Levels (SILs) which in our experience are difficult to apply and which lack credible assessment criteria. This paper proposes risk modelling as a basis for allocating SILs to software and illustrates its use. It also proposes software-directed evaluation criteria for SILs, to assess what level of integrity is actually achieved. We contend that the approach leads to more credible results and more cost-effective ways of delivering software safety assurance.
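
The risk-modelling approach described in the abstract is commonly realised as a risk-class matrix: each hazard is classified by severity and likelihood, and the resulting risk class determines the SIL allocated to the software that can cause or mitigate it. Below is a minimal Python sketch of one such allocation scheme; the severity and likelihood categories, the matrix, and the class-to-SIL mapping are illustrative assumptions only and are not taken from the paper.

    # Illustrative sketch only: an IEC 1508-style risk-class matrix for allocating
    # a Safety Integrity Level (SIL) to a software function. The category names,
    # matrix construction, and risk-class-to-SIL mapping are assumptions made for
    # illustration, not the scheme proposed in the paper.

    SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
    LIKELIHOOD = ["improbable", "remote", "occasional", "probable", "frequent"]

    # Risk class for each (likelihood, severity) pair; higher class means higher risk.
    RISK_CLASS = {
        (l, s): min(4, 1 + LIKELIHOOD.index(l) // 2 + SEVERITY.index(s))
        for l in LIKELIHOOD for s in SEVERITY
    }

    # Assumed mapping from risk class to required SIL (0 = no safety requirement).
    SIL_FOR_CLASS = {1: 0, 2: 1, 3: 2, 4: 4}


    def allocate_sil(likelihood: str, severity: str) -> int:
        """Return the SIL required to reduce the hazard's risk to a tolerable level."""
        return SIL_FOR_CLASS[RISK_CLASS[(likelihood, severity)]]


    if __name__ == "__main__":
        # Example: a catastrophic hazard whose software-related cause is judged 'remote'.
        print(allocate_sil("remote", "catastrophic"))  # -> 4 under these assumptions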

Keywords

HAZOP

Copyright information

© Springer-Verlag London Limited 1997

Authors and Affiliations

  1. Peter A. Lindsay, Software Verification Research Centre, School of Information Technology, The University of Queensland, Australia
  2. John A. McDermid, High Integrity Systems Engineering Group, Department of Computer Science, University of York, UK
