
Annals of Software Engineering, Volume 5, Issue 1, pp 279–292

A framework for performing verification and validation in reuse-based software engineering

  • Edward A. Addy

Abstract

Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
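
As a rough, hypothetical illustration (not taken from the paper) of one transition-phase activity the framework motivates, the Python sketch below checks whether the V&V evidence attached to reusable components during domain engineering covers the V&V activities a particular application requires; every class, field, and activity name in it is an assumption made for this sketch, not part of the author's framework.

    # Hypothetical sketch: find V&V activities an application requires that are
    # not already covered by domain-level evidence on its reusable components.
    from dataclasses import dataclass, field


    @dataclass
    class Component:
        """A reusable asset produced by domain engineering."""
        name: str
        vv_evidence: set[str] = field(default_factory=set)  # V&V activities completed at domain level


    @dataclass
    class Application:
        """An application-engineering product assembled from domain components."""
        name: str
        components: list[Component] = field(default_factory=list)
        required_evidence: set[str] = field(default_factory=set)  # V&V activities this application needs


    def transition_gaps(app: Application) -> dict[str, set[str]]:
        """Report, per component, which required V&V activities lack domain-level
        evidence and so must be performed during the transition from domain
        engineering to application engineering."""
        return {
            c.name: app.required_evidence - c.vv_evidence
            for c in app.components
            if app.required_evidence - c.vv_evidence
        }


    if __name__ == "__main__":
        nav = Component("navigation-filter", vv_evidence={"requirements-trace", "unit-test"})
        telem = Component("telemetry-formatter", vv_evidence={"requirements-trace"})
        flight = Application(
            "flight-software",
            components=[nav, telem],
            required_evidence={"requirements-trace", "unit-test", "hazard-analysis"},
        )
        for name, missing in transition_gaps(flight).items():
            print(f"{name}: missing {sorted(missing)}")

For the example data, the report shows that both components still lack hazard-analysis evidence and the telemetry formatter also lacks unit-test evidence, which is the kind of gap the framework assigns to V&V during the transition between domain and application engineering.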

Keywords

Domain Model; Domain Architecture; Domain Engineering; Reusable Component; Domain Product



Copyright information

© Kluwer Academic Publishers 1998

Authors and Affiliations

  • Edward A. Addy
    1. NASA/WVU Software Research Laboratory, NASA/WVU Software IV&V Facility, Fairmont, USA
