Testing closed source software: computer forensic tool case study

  • Manar Abu Talib
Original Paper


Computer forensic techniques are important for the prevention, detection, and investigation of electronic crime. Computer forensic investigators need computer forensic tools that produce reliable results, meet legal requirements, and are admissible in court. Most of these tools are closed-source, making the software a black box for testing purposes. This paper illustrates a black-box testing method for computer forensic tools based on functional scenarios.
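The scenario-based black-box approach described above can be sketched as follows: the closed-source tool is treated as an opaque function, and each functional scenario pairs an input with the observable output the tool is expected to produce. This is a minimal illustrative sketch, not the paper's actual procedure; the names `Scenario`, `run_scenarios`, and the stand-in tool `md5_of_image` are all hypothetical.

```python
# Minimal sketch of scenario-based black-box testing: the tool under test
# is an opaque callable, and each scenario records an input together with
# the expected observable output. All names here are illustrative.
import hashlib
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Scenario:
    name: str
    input_data: bytes
    expected: str  # the output the tool should report for this input

def run_scenarios(tool: Callable[[bytes], str],
                  scenarios: List[Scenario]) -> Dict[str, bool]:
    """Exercise the black-box tool and record pass/fail per scenario.
    Only inputs and observed outputs are used; internals stay hidden."""
    return {s.name: tool(s.input_data) == s.expected for s in scenarios}

# Hypothetical stand-in for a closed-source imaging tool: we can only
# observe the MD5 it reports for acquired data, not how it computes it.
def md5_of_image(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

scenarios = [
    Scenario("empty image", b"", hashlib.md5(b"").hexdigest()),
    Scenario("known payload", b"evidence", hashlib.md5(b"evidence").hexdigest()),
]
results = run_scenarios(md5_of_image, scenarios)
```

In practice each scenario would correspond to a documented functional requirement of the tool (e.g. correct acquisition of a known test image), so a failing scenario localizes a functional defect without any access to the source code.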


Keywords: Closed source software testing · Computer forensic tool testing · Black box testing · COSMIC-FFP · ISO/IEC 19761



Copyright information

© Springer-Verlag France SAS 2017

Authors and Affiliations

  1. University of Sharjah, Sharjah, United Arab Emirates
