Security Sensitive Data Flow Coverage Criterion for Automatic Security Testing of Web Applications

  • Thanh Binh Dao
  • Etsuya Shibayama
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6542)


Common coverage criteria for software testing, such as branch coverage and statement coverage, are often used to evaluate the adequacy of test cases created by automatic security testing methods. However, these criteria were not originally defined for security testing. In this paper, we discuss the limitations of traditional criteria and present a study of a new criterion called security sensitive data flow coverage. This criterion shows how well test cases cover security sensitive data flows. To evaluate the effectiveness of the proposed criterion, which is intended to guide test case generation, we conducted an experiment applying automatic security testing to real-world web applications. The results show that security sensitive data flow coverage helps reduce test cost while keeping the effectiveness of vulnerability detection high.
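The idea of the criterion can be sketched as follows: treat each security sensitive data flow as a (source, sink) pair, for example an HTTP request parameter reaching an SQL query function, and measure what fraction of such flows the generated test cases actually exercise. The sketch below is illustrative only; the names (`Flow`, `coverage`) and the example flows are assumptions, not the paper's actual tool or data.

```python
# Illustrative sketch of a "security sensitive data flow coverage" metric.
# A flow is a (source, sink) pair; coverage is the fraction of all known
# sensitive flows that the test cases have exercised.
from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    source: str  # e.g. an HTTP parameter name
    sink: str    # e.g. a function that consumes tainted data


def coverage(all_flows: set, exercised: set) -> float:
    """Fraction of security sensitive data flows covered by the tests."""
    if not all_flows:
        return 1.0
    return len(exercised & all_flows) / len(all_flows)


# Hypothetical flows identified in the application under test:
all_flows = {
    Flow("username", "mysql_query"),
    Flow("comment", "echo"),
    Flow("id", "mysql_query"),
}
# Flows actually reached by the generated test cases:
exercised = {Flow("username", "mysql_query"), Flow("comment", "echo")}

print(round(coverage(all_flows, exercised), 2))  # prints 0.67
```

A test generator guided by this metric would prioritize inputs that reach uncovered (source, sink) pairs rather than merely new branches, which is the intuition behind the cost reduction reported in the abstract.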


Keywords: automatic security testing, web application, coverage criteria, data flow



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Thanh Binh Dao, Dept. of Mathematical and Computing Sciences, Tokyo Institute of Technology, Meguro, Japan
  • Etsuya Shibayama, Information Technology Center, The University of Tokyo, Bunkyo-ku, Japan
