sIDMG: Small-Size Intrusion Detection Model Generation of Complimenting Decision Tree Classification Algorithm

  • Seung-Hyun Paek
  • Yoon-Keun Oh
  • Do-Hoon Lee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4298)


Most research on intrusion detection models using data mining technology has been dedicated to improving detection accuracy. However, the size of the intrusion detection model (e.g., the detection rules) is as important as detection accuracy. In this paper, a method called sIDMG is proposed for small-size intrusion detection model generation using our classification algorithm sC4.5. We also propose the algorithm sC4.5 for inducing small decision trees on specific data by complementing the split-attribute selection criteria of C4.5 during tree induction. The approach of sC4.5 is to select the attribute with the next-highest gain ratio as the split attribute if the training data set satisfies the bias properties of C4.5. The performance evaluation results show that sC4.5 preserves the detection accuracy of C4.5 while producing a smaller decision tree than the existing C4.5.
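The split-attribute rule the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: `gain_ratio` is standard C4.5, while the test for C4.5's bias properties is defined in the paper body and is represented here only by a hypothetical `bias_detected` flag.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, labels, attr):
    """C4.5 gain ratio = information gain / split information for one attribute."""
    n = len(rows)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(label)
    info_gain = entropy(labels) - sum(
        len(part) / n * entropy(part) for part in partitions.values())
    split_info = entropy([row[attr] for row in rows])
    return info_gain / split_info if split_info > 0 else 0.0

def select_split(rows, labels, attrs, bias_detected=False):
    """Rank candidate attributes by gain ratio.

    Plain C4.5 takes the top-ranked attribute; the sC4.5 idea, as the
    abstract states it, falls back to the next-highest gain ratio
    attribute when the training data exhibits C4.5's bias properties
    (here abstracted into the `bias_detected` flag)."""
    ranked = sorted(attrs, key=lambda a: gain_ratio(rows, labels, a), reverse=True)
    if bias_detected and len(ranked) > 1:
        return ranked[1]
    return ranked[0]
```

For example, on a toy data set where attribute `a` perfectly predicts the class and `b` is uninformative, `select_split` returns `a` under the plain C4.5 rule and `b` when the fallback is triggered.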


Keywords: Decision Tree · Intrusion Detection · Information Gain · Candidate Attribute · Gain Ratio





Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Seung-Hyun Paek
  • Yoon-Keun Oh
  • Do-Hoon Lee
  1. National Security Research Institute, 161 Gajeong-dong, Yuseong-gu, Daejeon, Korea
