
Journal of Network and Systems Management, Volume 19, Issue 3, pp 343–366

Objective Risk Evaluation for Automated Security Management

  • Mohammad Salim Ahmed
  • Ehab Al-Shaer
  • Mohamed Taibah
  • Latifur Khan
Article

Abstract

Network security depends on a number of factors, and a common characteristic of these factors is that they are dynamic in nature. Such factors include new vulnerabilities and threats, the network policy structure, and traffic. These factors can be divided into two broad categories: network risk and service risk. As the names imply, the former corresponds to risk associated with the network policy, whereas the latter depends on the services and software running on the system. Evaluating security from both the service and policy perspectives therefore allows the management system to decide how a system should be changed to enhance security as per the management objective. Such decision making includes choosing between alternative security architectures, designing security countermeasures, and systematically modifying security configurations to improve security. As there may be real-time changes to the network threat, this evaluation must be performed dynamically to handle such changes. In this paper, we provide a security metric framework that objectively quantifies the most significant security risk factors: existing vulnerabilities, the historical trend of vulnerabilities of remotely accessible services, the prediction of potential vulnerabilities for these services and their estimated severity, unused address space, and finally the propagation of an attack within the network. These factors cover both the service and the network aspects of risk toward a system. We have implemented this framework as a user-friendly tool called Risk based prOactive seCurity cOnfiguration maNAger (ROCONA) and show how this tool simplifies security configuration management of services and policies in a system using risk measurement and mitigation.
We also combine all the components into one single metric, present validation experiments using real-life vulnerability data from the National Vulnerability Database (NVD), and show a comparison with two existing risk measurement tools.
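The abstract's idea of collapsing several risk factors into a single metric can be illustrated with a minimal sketch. The factor names, the equal weighting, and all score values below are hypothetical placeholders, not the paper's actual formulation, which the full text develops in detail.

```python
# Hypothetical sketch: combining normalized risk-factor scores into one
# weighted metric. Names, weights, and values are illustrative only.

def combined_risk(components, weights):
    """Weighted average of risk components, each assumed to lie in [0, 1]."""
    assert set(components) == set(weights), "each component needs a weight"
    total_weight = sum(weights.values())
    return sum(weights[k] * components[k] for k in components) / total_weight

# The five risk factors named in the abstract, with made-up scores.
scores = {
    "existing_vulnerabilities": 0.70,
    "historical_trend": 0.40,
    "predicted_vulnerabilities": 0.55,
    "unused_address_space": 0.20,
    "attack_propagation": 0.35,
}
weights = {k: 1.0 for k in scores}  # equal weighting, for illustration

risk = combined_risk(scores, weights)  # 0.44 under these sample values
```

A management system could then compare such scores across alternative configurations; the actual framework weighs and validates the factors against NVD data rather than using fixed illustrative weights.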

Keywords

Security evaluation · Risk prediction · Vulnerability measure · Attack propagation · Attack immunity · Quality of protection

Notes

Acknowledgments

The authors would like to thank Muhammad Abedin and Syeda Nessa of The University of Texas at Dallas for their help with the formalization and experiments making this work possible.


Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Mohammad Salim Ahmed (1)
  • Ehab Al-Shaer (2)
  • Mohamed Taibah (3)
  • Latifur Khan (1)
  1. University of Texas at Dallas, Richardson, USA
  2. University of North Carolina at Charlotte, Charlotte, USA
  3. DePaul University, Chicago, USA
