
Diversity in Open Source Intrusion Detection Systems

  • Hafizul Asad
  • Ilir Gashi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11093)

Abstract

We present an analysis of the diversity that exists in the rules and blacklisted IP addresses of the Snort and Suricata Intrusion Detection Systems (IDSs). We analysed the evolution of the rulesets and blacklisted IP addresses of these two IDSs over a 5-month period between May and October 2017. We used three different off-the-shelf default configurations of the Snort IDS and the Emerging Threats (ET) configuration of the Suricata IDS. Analysing the differences between these systems allows us to gain insights into where the diversity in their behaviour comes from and how it evolves over time. This gives security architects insight into how they can combine and layer these systems in a defence-in-depth deployment. To the best of our knowledge, a similar experiment has not been performed before. We also show results on the observed diversity in the behaviour of these systems when they analysed the network data of the DMZ network of City, University of London.
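The comparison described in the abstract amounts to measuring how much two IDS configurations overlap in their rules or blacklisted IPs, and how that overlap changes across snapshots. The sketch below is a minimal illustrative Python example, not the authors' actual tooling; the snapshot file names and the one-entry-per-line format are assumptions made for illustration. It computes the shared and distinct entries and a Jaccard similarity between two hypothetical blacklist snapshots.

```python
# Illustrative sketch only: the paper does not publish its analysis scripts.
# Computes overlap/diversity metrics between two IDS blacklist snapshots,
# assuming each file lists one IP address (or rule identifier) per line.

def load_entries(path):
    """Read one entry per line, ignoring blank lines and '#' comments."""
    with open(path) as f:
        return {line.strip() for line in f
                if line.strip() and not line.startswith("#")}

def diversity_report(set_a, set_b, name_a="Snort", name_b="Suricata_ET"):
    """Return counts of shared/unique entries and Jaccard similarity."""
    shared = set_a & set_b
    union = set_a | set_b
    jaccard = len(shared) / len(union) if union else 0.0
    return {
        "shared": len(shared),
        f"only_{name_a}": len(set_a - set_b),
        f"only_{name_b}": len(set_b - set_a),
        "jaccard_similarity": round(jaccard, 3),
    }

if __name__ == "__main__":
    # Hypothetical snapshot files taken on the same day for both IDSs.
    snort = load_entries("snort_blacklist_2017-05-01.txt")
    suricata = load_entries("et_blacklist_2017-05-01.txt")
    print(diversity_report(snort, suricata))
```

Repeating such a comparison over successive snapshots (e.g. daily or weekly) would show how the shared and IDS-specific portions of the rulesets and blacklists evolve over the study period.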

Keywords

Security assessment · Security tools · Intrusion detection systems · Design diversity

Notes

Acknowledgment

This work was supported by the UK EPSRC project D3S and in part by the EU H2020 framework DiSIEM project.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Centre for Software Reliability, City, University of London, London, UK
