Verification of the cryptlib Kernel

5.5 Conclusion

This chapter has presented a new approach to building a trusted system, introducing the concept of an (obviously) trustworthy system rather than a trusted (because we say so) system. The verification methodology that is used to construct this system has been specially designed to instil confidence in users by allowing them to verify the design specification and implementation themselves through the use of “all the way down” verification. This is achieved through a combination of design-by-contract and specification-based testing, either with C assertions or ADL. Although this type of verification has long been classed as “beyond A1” (also known as “impossible at the current state of the art”), by carefully matching the verification methodology to the system design it is possible to perform this type of verification in this particular instance. Michael Jackson (the other one) has observed that “It’s a good rule of thumb that the value of a method is inversely proportional to its generality. A method for solving all problems can give you very little help with any particular problem” [140]. The method presented here has exactly the opposite properties. Far from trying to be a silver bullet, it constitutes a kryptonite bullet, one that is spectacularly effective against werewolves from Krypton, and not much good against any other kind. However, this doesn’t matter to us since all that’s important is that it’s the right tool for the job. Attacking a werewolf with a Swiss army chainsaw is no more useful. It just makes a bigger mess.
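
The contract-based verification described above can be made concrete with a small C sketch. The following fragment is a minimal illustration only: the function name krnlSetAttribute, the KERNEL_OBJECT type, and the MAX_ATTRIBUTE_VALUE bound are hypothetical stand-ins rather than the actual cryptlib kernel interface. It shows the design specification carried directly in the code as pre- and postcondition assertions, together with a small specification-based test that exercises the boundary values named in the contract.

/* Minimal sketch of design-by-contract plus specification-based testing in C.
   All names below (krnlSetAttribute, KERNEL_OBJECT, MAX_ATTRIBUTE_VALUE) are
   hypothetical illustrations, not the actual cryptlib kernel API. */

#include <assert.h>
#include <stdio.h>

#define MAX_ATTRIBUTE_VALUE 1000    /* Hypothetical upper bound from the spec */

typedef struct {
    int attribute;        /* Attribute value held by the object */
    int isInitialised;    /* Nonzero once the object is ready for use */
} KERNEL_OBJECT;

/* Set an attribute on a kernel object.  The design specification appears in
   the code as precondition and postcondition assertions, so the same
   statements serve as documentation, run-time checks, and test oracles. */
static int krnlSetAttribute(KERNEL_OBJECT *objectPtr, const int value)
{
    /* Preconditions taken from the specification */
    assert(objectPtr != NULL);
    assert(objectPtr->isInitialised);
    assert(value >= 0 && value <= MAX_ATTRIBUTE_VALUE);

    objectPtr->attribute = value;

    /* Postcondition: the object now holds exactly the requested value */
    assert(objectPtr->attribute == value);
    return 0;
}

/* Specification-based test: exercise the boundary values named in the
   contract rather than arbitrarily chosen inputs */
int main(void)
{
    const int testValues[] = { 0, 1, MAX_ATTRIBUTE_VALUE - 1, MAX_ATTRIBUTE_VALUE };
    KERNEL_OBJECT testObject = { 0, 1 };    /* attribute = 0, initialised */
    int i;

    for (i = 0; i < 4; i++) {
        krnlSetAttribute(&testObject, testValues[i]);
        printf("attribute = %d\n", testObject.attribute);
    }
    return 0;
}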

5.6 References

  1. “On the Need for Practical Formal Methods”, Constance Heitmeyer, Proceedings of the 5th International Symposium on Formal Techniques in Real-Time and Fault-Tolerant Systems (FTRTFT’98), Springer-Verlag Lecture Notes in Computer Science, No.1486, September 1998, p.18.

  2. “A Controlled Experiment in Program Testing and Code Walkthroughs/Inspections”, Glenford Myers, Communications of the ACM, Vol.21,No.9 (September 1978), p.760.

  3. “Software Inspection and the Industrial Production of Software”, A.Frank Ackerman, Priscilla Fowler, and Robert Ebenau, Proceedings of the Symposium on Software Validation, Elsevier Science Publishers, 1984, p.13.

  4. “Software Inspections: An Effective Verification Process”, A.Frank Ackerman, Lynne Buchwald, and Frank Lewski, IEEE Software, Vol.6,No.3 (May 1989), p.31.

  5. “Handbook of Walkthroughs, Inspections, and Technical Reviews”, Daniel Freedman and Gerald Weinberg, Dorset House, 1990.

  6. “Practical Software Metrics for Project Management and Process Improvement”, Robert Grady, Prentice-Hall, 1992.

  7. “Software Inspections: An Industry Best Practice”, David Wheeler, Bill Brykczynski, and Reginald Meeson Jr., IEEE Computer Society Press, 1996.

  8. “Software Inspections and the Cost-Effective Production of Reliable Software”, A.Frank Ackerman, Software Engineering, IEEE Computer Society Press, 1997, p.235.

  9. “National Software Quality Experiment: Results 1992–1996”, Don O’Neill, Proceedings of the 8th Annual Software Technology Conference, April 1996.

  10. “Analysis of a Kernel Verification”, Terry Vickers Benzel, Proceedings of the 1984 IEEE Symposium on Security and Privacy, IEEE Computer Society Press, August 1984, p.125.

  11. “A Retrospective on the VAX VMM Security Kernel”, Paul Karger, Mary Ellen Zurko, Douglas Bonin, Andrew Mason, and Clifford Kahn, IEEE Transactions on Software Engineering, Vol.17,No.11 (November 1991), p.1147.

  12. “Building Reliable Secure Computing Systems out of Unreliable Insecure Components”, John Dobson and Brian Randell, Proceedings of the 1986 IEEE Symposium on Security and Privacy, IEEE Computer Society Press, August 1986, p.187.

  13. “Fault Tolerance and Security”, Brian Randell, Dependable Computing and Fault-Tolerant Systems, Vol.9, Springer-Verlag, 1995, p.389.

  14. “Building Reliable Secure Computing Systems out of Unreliable Insecure Components”, John Dobson and Brian Randell, Proceedings of the 17th Annual Computer Security Applications Conference (ACSAC’01), December 2001, p.162 (this is an update/retrospective on [12]).

  15. “Building Secure Software”, John Viega and Gary McGraw, Addison-Wesley, 2002.

  16. “Open Source Security: Opportunity or Oxymoron”, George Lawton, IEEE Computer, Vol.35,No.3 (March 2002), p.18.

  17. “The Performance of the N-Fold Requirement Inspection Method”, Eliezer Kantorowitz, Arie Guttman, and Lior Arzi, Requirements Engineering, Vol.2,No.3 (1997), p.152.

  18. “N-fold inspection: A requirements analysis technique”, Johnny Martin and Wei-Tek Tsai, Communications of the ACM, Vol.33,No.2 (February 1990), p.225.

  19. “An Experimental Study of Fault Detection in User Requirements Documents”, G.Michael Schneider, Johnny Martin, and W. Tsai, ACM Transactions on Software Engineering and Methodology, Vol.1,No.2 (April 1992), p.188.

  20. “Ensuring Software Integrity”, Jonathan Weiss and Edward Amoroso, Proceedings of the 4th Aerospace Computer Security Applications Conference, December 1988, p.323.

  21. “Toward an Approach to Measuring Software Trust”, Ed Amoroso, Thu Nguyen, Jon Weiss, John Watson, Pete Lapiska, and Terry Starr, Proceedings of the 1991 IEEE Symposium on Security and Privacy, IEEE Computer Society Press, August 1991, p.198.

  22. “Mondex Blows Users Anonymity”, Gavin Clarke and Madeleine Acey, Network Week, 25 October 1995.

  23. “Mondex’s double life: E-Cash both ‘private’ and ‘fully auditable’”, Niall McKay, Infoworld Canada, 7 May 1997.

  24. “The role of comprehension in software inspection”, A. Dunsmore, M. Roper, and M. Wood, The Journal of Systems and Software, Vol.52,No.2/3 (1 June 2000), p.121.

  25. “The Evaluation of Three Specification and Verification Methodologies”, Richard Platek, Proceedings of the 4th Seminar on the DoD Computer Security Initiative (later the National Computer Security Conference), August 1981, p.X-1.

  26. “What non-programmers know about programming: Natural language procedure specification”, Kathleen Galotti and William Ganong III, International Journal of Man-Machine Studies, Vol.22,No.1 (January 1985), p.1.

  27. “Estimating Understandability of Software Documents”, Kari Laitinen, ACM SIGSOFT Software Engineering Notes, Vol.21,No.4 (July 1996), p.81.

  28. “Issues in the Full Scale use of Formal Methods for Automated Testing”, J. Crowley, J. Leathrum, and K. Liburdy, Proceedings of the 1996 International Symposium on Software Testing and Analysis (ISSTA’96), ACM, January 1996, p.71.

  29. “The PKI Specification Dilemma: A Formal Solution”, Maris Ozols, Marie Henderson, Chichang Liu, and Tony Cant, Proceedings of the 5th Australasian Conference on Information Security and Privacy (ACISP’00), Springer-Verlag Lecture Notes in Computer Science No.1841, July 2000, p.206.

  30. “A Translation Method from Natural Language Specifications into Formal Specifications Using Contextual Dependencies”, Yasunori Ishihara, Hiroyuki Seki, and Tadao Kasami, Proceedings of the IEEE International Symposium on Requirements Engineering, IEEE Computer Society Press, January 1993, p.232.

  31. “Processing Natural Language Software Requirement Specifications”, Miles Osborne and Craig MacNish, Proceedings of the 2nd International Conference on Requirements Engineering (ICRE’96), IEEE Computer Society Press, April 1996, p.229.

  32. “The Role of Natural Language in Requirements Engineering”, Kevin Ryan, Proceedings of the IEEE International Symposium on Requirements Engineering, IEEE Computer Society Press, January 1993, p.240.

  33. “Lessons Learned in an Industrial Software Lab”, Albert Endres, IEEE Software, Vol.10,No.5 (September 1993), p.58.

  34. “Cognitive Fit: An Empirical Study of Information Acquisition”, Iris Vessey and Dennis Galletta, Information Systems Research, Vol.2,No.1 (March 1991), p.63.

  35. “Cognitive Fit: An Empirical Study of Recursion and Iteration”, Atish Sinha and Iris Vessey, IEEE Transactions on Software Engineering, Vol.18,No.5 (May 1992), p.368.

  36. “On the Nature of Bias and Defects in the Software Specification Process”, Pablo Straub and Marvin Zelkowitz, Proceedings of the 16th International Computer Software and Applications Conference (COMPSAC’92), IEEE Computer Society Press, September 1992, p.17.

  37. “An Empirical Investigation of the Effects of Formal Specifications on Program Diversity”, Thomas McVittie, John Kelly, and Wayne Yamamoto, Dependable Computing and Fault-Tolerant Systems, Vol.6, Springer-Verlag, 1992, p.219.

  38. “Specifications are (preferably) executable”, Norbert Fuchs, Software Engineering Journal, Vol.7,No.5 (September 1992), p.323.

  39. “From Formal Methods to Formally Based Methods: An Industrial Experience”, ACM Transactions on Software Engineering and Methodology, Vol.8,No.1 (January 1999), p.79.

  40. “An Avenue for High Confidence Applications in the 21st Century”, Timothy Kremann, William Martin, and Frank Taylor, Proceedings of the 21st National Information Systems Security Conference (formerly the National Computer Security Conference), October 1999, CDROM distribution.

  41. “Formal Methods and Testing: Why the State-of-the-Art is Not the State-of-the-Practice”, ISSTA’96/FMSP’96 panel summary, David Rosenblum, ACM SIGSOFT Software Engineering Notes, Vol.21,No.4 (July 1996), p.64.

  42. “A Case Study of SREM”, Paul Scheffer, Albert Stone, and William Rzepka, IEEE Computer, Vol.18,No.3 (April 1985), p.47.

  43. “Seven Myths of Formal Methods”, Anthony Hall, IEEE Software, Vol.7,No.5 (September 1990), p.11.

  44. “Striving for Correctness”, Marshall Abrams and Marvin Zelkowitz, Computers and Security, Vol.14,No.8 (1995), p.719.

  45. “Symbol Security Condition Considered Harmful”, Marvin Schaefer, Proceedings of the 1989 IEEE Symposium on Security and Privacy, IEEE Computer Society Press, August 1989, p.20.

  46. “A Method for Revalidating Modified Programs in the Maintenance Phase”, S. Yau and Z. Kishimoto, Proceedings of the 11th International Computer Software and Applications Conference (COMPSAC’87), October 1987, p.272.

  47. “Techniques for Selective Revalidation”, Jean Hartman and David Robson, IEEE Software, Vol.7,No.1 (January 1990), p.31.

  48. “A New Paradigm for Trusted Systems”, Dorothy Denning, Proceedings of the New Security Paradigms Workshop ’92, 1992, p.36.

  49. “Lessons from the Design of the Eiffel Libraries”, Bertrand Meyer, Communications of the ACM, Vol.33,No.9 (September 1990), p.69.

  50. “Applying ‘Design by Contract’”, Bertrand Meyer, IEEE Computer, Vol.25,No.10 (October 1992), p.40.

  51. “iContract — The Java Design by Contract Tool”, Reto Kramer, Technology of Object-Oriented Languages and Systems, IEEE Computer Society Press, 1998, p.295.

  52. “The Object Constraint Language”, Jos Warmer and Anneke Kleppe, Addison Wesley, 1999.

  53. “Making Components Contract-aware”, Antoine Beugnard, Jean-Marc Jézéquel, Noël Plouzeau, and Damien Watkins, IEEE Computer, Vol.32,No.7 (July 1999), p.38.

  54. “Object-oriented Software Construction”, Bertrand Meyer, Prentice Hall, 1988.

  55. “Human memory: A proposed system and its control processes”, R. Atkinson and R. Shiffrin, The psychology of learning and motivation: Advances in research and theory, Vol.2, Academic Press, 1968, p.89.

  56. “The control of short-term memory”, R. Atkinson and R. Shiffrin, Scientific American, Vol.225 (August 1971), p.82.

  57. “Über das Gedächtnis”, Hermann Ebbinghaus, Duncker and Humblot, 1885.

  58. “The magical number seven, plus or minus two: Some limits on our capacity for processing information”, George Miller, Psychological Review, Vol.63,No.2 (March 1956), p.81.

  59. “Human Memory: Structures and Processes”, Roberta Klatzky, W.H.Freeman and Company, 1980.

  60. “Learning and Memory”, William Gordon, Brooks/Cole Publishing Company, 1989.

  61. “Human Memory: Theory and Practice”, Alan Baddeley, Allyn and Bacon, 1998.

  62. “Empirical Studies of Programming Knowledge”, Elliot Soloway and Kate Ehrlich, IEEE Transactions on Software Engineering, Vol.10,No.5 (September 1984), p.68.

  63. “An integrating common framework for measuring cognitive software complexity”, Zsolt Öry, Software Engineering Journal, Vol.8,No.5 (September 1993), p.263.

  64. “Software Psychology: Human Factors in Computers and Information Systems”, Ben Shneiderman, Winthrop Publishers Inc, 1980.

  65. “A study in dimensions of psychological complexity of programs”, B. Chaudhury and H. Sahasrabuddhe, International Journal of Man-Machine Studies, Vol.23,No.2 (August 1985), p.113.

  66. “Does OO Sync with How We Think?”, Les Hatton, IEEE Software, Vol.15,No.3 (May/June 1998), p.46.

  67. “Experimental assessment of the effect of inheritance on the maintainability of object-oriented systems”, R. Harrison, S. Counsell, and R. Nithi, The Journal of Systems and Software, Vol.52,No.2/3 (1 June 2000), p.173.

  68. “Exploring the relationships between design measures and software quality in object-oriented systems”, Lionel Briand, Jürgen Wüst, John Daly, and D. Victor Porter, The Journal of Systems and Software, Vol.51,No.3 (1 May 2000), p.245.

  69. “An Empirical Investigation of an Object-Oriented Software System”, Michelle Cartwright and Martin Shepperd, IEEE Transactions on Software Engineering, Vol.26,No.8 (August 2000), p.786.

  70. “The effect of semantic complexity on the comprehension of program modules”, Barbee Mynatt, International Journal of Man-Machine Studies, Vol.21,No.2 (August 1984), p.91.

  71. “Program Comprehension Beyond the Line”, Scott Robertson, Erle Davis, Kyoko Okabe, and Douglas Fitz-Randolf, Proceedings of Human-Computer Interaction — INTERACT’90, Elsevier Science Publishers, 1990, p.959.

  72. “Flow Diagrams, Turing Machines And Languages With Only Two Formation Rules”, Corrado Böhm and Giuseppe Jacopini, Communications of the ACM, Vol.9,No.5 (May 1966), p.336.

  73. “The Psychology of How Novices Learn Computer Programming”, Richard Mayer, Computing Surveys, Vol.13,No.1 (March 1981), p.121.

  74. “Towards a theory of the comprehension of computer programs”, Ruven Brooks, International Journal of Man-Machine Studies, Vol.8,No.6 (June 1983), p.543.

  75. “Beacons in computer program comprehension”, Susan Weidenbeck, International Journal of Man-Machine Studies, Vol.25,No.6 (December 1986), p.697.

  76. “Syntactic/Semantic Interactions in Programmer Behaviour: A Model and Experimental Results”, Ben Shneiderman and Richard Mayer, International Journal of Computer and Information Sciences, Vol.8,No.3 (June 1979), p.219.

  77. “Expertise in debugging computer programs: A process analysis”, Iris Vessey, International Journal of Man-Machine Studies, Vol.23,No.5 (November 1985), p.459.

  78. “Knowledge and Process in the Comprehension of Computer Programs”, Elliott Soloway, Beth Adelson, and Kate Ehrlich, The Nature of Expertise, Lawrence Erlbaum and Associates, 1988, p.129.

  79. “Program Comprehension During Software Maintenance and Evolution”, Anneliese von Mayrhauser and A.Marie Vans, IEEE Computer, Vol.28,No.8 (August 1995), p.44.

  80. “The Programmer’s Burden: Developing Expertise in Programming”, Robert Campbell, Norman Brown, and Lia DiBello, The Psychology of Expertise: Cognitive Research and Empirical AI, Springer-Verlag, 1992, p.269.

  81. “Advanced Organisers in Computer Instruction Manuals: Are they Effective?”, Barbee Mynatt and Katherine Macfarlane, Proceedings of Human-Computer Interaction (INTERACT’87), Elsevier Science Publishers, 1987, p.917.

  82. “Characteristics of the mental representations of novice and expert programmers: an empirical study”, Susan Weidenbeck and Vikki Fix, International Journal of Man-Machine Studies, Vol.39,No.5 (November 1993), p.793.

  83. “Cognitive design elements to support the construction of a mental model during software exploration”, M.-A. Storey, F. Fracchia, and H. Müller, The Journal of Systems and Software, Vol.44,No.3 (January 1999), p.171.

  84. “Towards a theory of the cognitive processes in computer programming”, Ruven Brooks, International Journal of Man-Machine Studies, Vol.9,No.6 (November 1977), p.737.

  85. “Change-Episodes in Coding: When and How Do Programmers Change Their Code?”, Wayne Gray and John Anderson, Empirical Studies of Programmers: Second Workshop, Ablex Publishing Corporation, 1987, p.185.

  86. “Cognitive Processes in Software Design: Activities in the Early, Upstream Design”, Raymonde Guindon, Herb Krasner, and Bill Curtis, Proceedings of Human-Computer Interaction (INTERACT’87), Elsevier Science Publishers, 1987, p.383.

  87. “A Model of Software Design”, Beth Adelson and Elliot Soloway, The Nature of Expertise, Lawrence Erlbaum and Associates, 1988, p.185.

  88. “Novice-Expert Differences in Software Design”, B. Adelson, D. Littman, K. Ehrlich, J. Black, and E. Soloway, Proceedings of Human-Computer Interaction (INTERACT’84), Elsevier Science Publishers, 1984, p.473.

  89. “Stereotyped Program Debugging: An Aid for Novice Programmers”, Harald Wertz, International Journal of Man-Machine Studies, Vol.16,No.4 (May 1982), p.379.

  90. “Expert Programmers Re-establish Intentions When Debugging Another Programmer’s Program”, Ray Waddington, Proceedings of Human-Computer Interaction (INTERACT’90), Elsevier Science Publishers, 1990, p.965.

  91. “An Assertion Mapping Approach to Software Testing”, Greg Bullough, Jim Loomis, and Peter Weiss, Proceedings of the 13th National Computer Security Conference, October 1990, p.266.

  92. “Testing Object-Oriented Systems: Models, Patterns, and Tools”, Robert Binder, Addison-Wesley, 1999.

  93. “Powerful Assertions in C++”, Harald Mueller, C/C++ User’s Journal, Vol.12,No.10 (October 1994), p.21.

  94. “An Overview of Anna, a Specification Language for Ada”, David Luckham and Friedrich von Henke, IEEE Software, Vol.2,No.2 (March 1985), p.9.

  95. “A methodology for formal specification and implementation of Ada packages using Anna”, Neel Madhav and Walter Mann, Proceedings of the 1990 Computer Software and Applications Conference, IEEE Computer Society Press, 1990, p.491.

  96. “Programming with Specifications: An Introduction to ANNA, A Language for Specifying Ada Programs”, David Luckham, Texts and Monographs in Computer Science, Springer-Verlag, January 1991.

  97. “Nana — GNU Project — Free Software Foundation (FSF)”, http://www.gnu.org/software/nana/nana.html.

  98. “A Practical Approach to Programming With Assertions”, David Rosenblum, IEEE Transactions on Software Engineering, Vol.21,No.1 (January 1995), p.19.

  99. “The Behavior of C++ Classes”, Marshall Cline and Doug Lea, Proceedings of the Symposium on Object-Oriented Programming Emphasizing Practical Applications, ACM, September 1990.

  100. “The New Hacker’s Dictionary (3rd Edition)”, Eric S. Raymond, MIT Press, 1996.

  101. “The Algebraic Specification of Abstract Data Types”, John Guttag and James Horning, Acta Informatica, Vol.10 (1978), p.27.

  102. “SELECT — A Formal System for Testing and Debugging Programs by Symbolic Execution”, Robert Boyer, Bernard Elspas, and Karl Levitt, ACM SIGPLAN Notices, Vol.10,No.6 (June 1975), p.234.

  103. “Data-Abstraction Implementation, Specification, and Testing”, John Gannon, Paul McMullin, and Richard Hamlet, ACM Transactions on Programming Languages and Systems, Vol.3,No.3 (July 1981), p.211.

  104. “Daistish: Systematic Algebraic Testing for OO Programs in the Presence of Side-effects”, Merlin Hughes and David Stotts, Proceedings of the 1996 International Symposium on Software Testing and Analysis (ISSTA’96), ACM, January 1996, p.53.

  105. “The ASTOOT Approach to Testing Object-Oriented Programs”, Roong-Ko Doong and Phyllis Frankl, ACM Transactions on Software Engineering and Methodology, Vol.3,No.2 (April 1994), p.101.

  106. “Automating specification-based software testing”, Robert Poston, IEEE Computer Society Press, 1996.

  107. “Specifications are not (necessarily) executable”, Ian Hayes and Cliff Jones, Software Engineering Journal, Vol.4,No.6 (November 1989), p.330.

  108. “Executing formal specifications need not be harmful”, Andrew Gravell and Peter Henderson, Software Engineering Journal, Vol.11,No.2 (March 1996), p.104.

  109. “Feasibility of Model Checking Software Requirements: A Case Study”, Tirumale Sreemani and Joanne Atlee, Proceedings of the 11th Annual Conference on Computer Assurance (COMPASS’96), IEEE Computer Society Press, June 1996, p.77.

  110. “A Practical Assessment of Formal Specification Approaches for Data Abstraction”, K. Ventouris and P. Pintelas, The Journal of Systems and Software, Vol.17,No.1 (January 1992), p.169.

  111. “Languages for the Specification of Software”, Daniel Cooke, Ann Gates, Elif Demirörs, Onur Demirörs, Murat Tanik, and Bernd Krämer, The Journal of Systems and Software, Vol.32,No.3 (March 1996), p.269.

  112. “Standard Reference Model for Computing System Engineering Tool Interconnections”, IEEE Standard 1175:1992, IEEE, 1992.

  113. “Languages for Specification, Design, and Prototyping”, Valdis Berzins and Luqi, Handbook of Computer-Aided Software Engineering, Van Nostrand Reinhold, 1990, p.83.

  114. “An Introduction to the Specification Language Spec”, Valdis Berzins and Luqi, IEEE Software, Vol.7,No.2 (March 1990), p.74.

  115. “Specifying Distributed Systems”, Butler Lampson, Working Material for the International Summer School on Constructive Methods in Computing Science, August 1988.

  116. “A Tutorial on Larch and LCL, a Larch/C Interface Language”, John Guttag and James Horning, Proceedings of the 4th International Symposium of VDM Europe (VDM’91), Formal Software Development Methods, Springer-Verlag Lecture Notes in Computer Science, No.552, 1991, p.1.

  117. “Larch: Languages and Tools for Formal Specification”, John Guttag and James Horning, Springer-Verlag Texts and Monographs in Computer Science, 1993.

  118. “Preliminary Design of ADL/C++ — A Specification Language for C++”, Sreenivasa Viswanadha and Sriram Sankar, Proceedings of the 2nd Conference on Object-Oriented Technology and Systems (COOTS’96), Usenix Association, June 1996, p.97.

  119. “Specifying and Testing Software Components using ADL”, Sriram Sankar and Roger Hayes, Sun Microsystems Technical Report TR-94-23, Sun Microsystems Laboratories Inc, April 1994.

  120. “Eiffel: The Language”, Bertrand Meyer, Prentice-Hall, 1991.

  121. “Literate specifications”, C. Johnson, Software Engineering Journal, Vol.11,No.4 (July 1996), p.225.

  122. “Increasing Assurance with Literate Programming Techniques”, Andrew Moore and Charles Payne Jr., Proceedings of the 11th Annual Conference on Computer Assurance (COMPASS’96), IEEE Computer Society Press, June 1996, p.187.

  123. “Fault Tolerance by Design Diversity: Concepts and Experiments”, Algirdas Avizienis and John Kelly, IEEE Computer, Vol.17,No.8 (August 1984), p.67.

  124. “The N-version Approach to Fault Tolerant Systems”, Algirdas Avizienis, IEEE Transactions on Software Engineering, Vol.11,No.12 (December 1985), p.1491.

  125. “The Use of Self Checks and Voting in Software Error Detection: An Empirical Study”, Nancy Leveson, Stephen Cha, John Knight, and Timothy Shimeall, IEEE Transactions on Software Engineering, Vol.16,No.4 (April 1990), p.432.

  126. “N-version design versus one good version”, Les Hatton, IEEE Software, Vol.14,No.6 (November/December 1997), p.71.

  127. “Automatically Checking an Implementation against Its Formal Specification”, Sergio Antoy and Dick Hamlet, IEEE Transactions on Software Engineering, Vol.26,No.1 (January 2000), p.55.

  128. “Checking the Play in Plug-and-Play”, Harry Goldstein, IEEE Spectrum, Vol.39,No.6 (June 2002), p.50.

  129. “On the Use of Executable Assertions in Structured Programs”, Ali Mili, Sihem Guemara, Ali Jaoua, and Paul Torrés, The Journal of Systems and Software, Vol.7,No.1 (March 1987), p.15.

  130. “A Method for Test Data Selection”, F. Velasco, The Journal of Systems and Software, Vol.7,No.2 (June 1987), p.89.

  131. “Approaches to Specification-Based Testing”, Debra Richardson, Owen O’Malley, and Cindy Tittle, Proceedings of the Third ACM SIGSOFT Symposium on Software Testing, Analysis and Verification, December 1989, p.86.

  132. “Using Programmer-Written Compiler Extensions to Catch Security Holes”, Ken Ashcroft and Dawson Engler, Proceedings of the 2002 IEEE Symposium on Security and Privacy, May 2002, p.143.

  133. “Security Testing as an Assurance Mechanism”, Susan Walter, Proceedings of the 3rd Annual Canadian Computer Security Symposium, May 1991, p.337.

  134. “Predicate-Based Test Generation for Computer Programs”, Kuo-Chung Tai, Proceedings of the 15th International Conference on Software Engineering (ICSE’93), IEEE Computer Society/ACM Press, May 1993, p.267.

  135. “Assertion-Oriented Automated Test Data Generation”, Bogdan Korel and Ali Al-Yami, Proceedings of the 18th International Conference on Software Engineering (ICSE’96), IEEE Computer Society Press, 1996, p.71.

  136. “Structural Specification-based Testing with ADL”, Juei Chang and Debra Richardson, Proceedings of the 1996 International Symposium on Software Testing and Analysis (ISSTA’96), January 1996, p.62.

  137. “Structural Specification-Based Testing: Automated Support and Experimental Evaluation”, Juei Chang and Debra Richardson, Proceedings of the 7th European Software Engineering Conference (ESEC/FSE’99), Springer-Verlag Lecture Notes in Computer Science, No.1687, November 1999.

  138. “Software Requirements and Specifications: A Lexicon of Practice, Principles, and Prejudices”, Michael Jackson, Addison-Wesley, 1995.

Copyright information

© 2004 Springer-Verlag New York, Inc.
