Nonlinear Rule Based Ensemble Methods for the Prediction of Number of Faults

  • Santosh Singh Rathore
  • Sandeep Kumar
Part of the SpringerBriefs in Computer Science book series


In the previous chapter, we explored the use of linear rule-based ensemble methods for the prediction of the number of faults. In that work, we used four different ensemble methods, each of which combines the outputs of the base learners in a linear form. The results of the experimental analysis showed that a stable and accurate fault prediction performance could be achieved using linear rule-based ensemble methods. However, these ensemble methods capture only the weighted contributions of the base learners and combine them in a linear way, which may suffer from the linearity error of fitting a straight line [1].
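The linear combination rule described above can be sketched as follows. This is a minimal illustration, not the chapter's exact method: three hypothetical base learners are simulated as noisy predictors of a synthetic fault-count target, and their outputs are combined as a straight weighted sum, with the weights fitted by least squares on the prediction matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fault-count targets and three hypothetical base-learner outputs
# (stand-ins for, e.g., a tree, a linear model, and a mean predictor).
y = rng.poisson(lam=3.0, size=200).astype(float)
preds = np.column_stack([
    y + rng.normal(0.0, 1.0, size=y.size),   # learner 1: small noise
    y + rng.normal(0.5, 2.0, size=y.size),   # learner 2: biased, noisier
    np.full(y.size, y.mean()),               # learner 3: constant mean
])

# Linear rule: choose weights w minimising ||preds @ w - y||^2,
# then combine the base outputs as a weighted sum.
w, *_ = np.linalg.lstsq(preds, y, rcond=None)
combined = preds @ w

mse_base = ((preds - y[:, None]) ** 2).mean(axis=0)
mse_combined = ((combined - y) ** 2).mean()
print("per-learner MSE:", np.round(mse_base, 2))
print("ensemble MSE:   ", round(mse_combined, 2))
```

Because each individual learner corresponds to a feasible weight vector (all weight on one column), the least-squares combination can never have higher training MSE than the best single learner; a nonlinear combiner, the subject of this chapter, is free to model interactions among the base outputs that no such weighted sum can capture.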


References

  1. Fox, J.: Regression Diagnostics: An Introduction, vol. 79. Sage (1991)
  2. Rathore, S.S., Kumar, S.: Linear and non-linear heterogeneous ensemble methods to predict the number of faults in software systems. Knowl.-Based Syst. 119, 232–256 (2017)
  3. Todorovski, L., Dzeroski, S.: Combining classifiers with meta decision trees. Mach. Learn. 50(3), 223–249 (2003)
  4. Webb, G.I., Zheng, Z.: Multistrategy ensemble learning: reducing error by combining ensemble learning techniques. IEEE Trans. Knowl. Data Eng. 16(8), 980–991 (2004)
  5. Boetticher, G.: The PROMISE repository of empirical software engineering data (2007)
  6. Agrawal, R., Srikant, R.: Privacy-preserving data mining. ACM SIGMOD Rec. 29(2), 439–450 (2000)
  7. Breiman, L.: Stacked regressions. Mach. Learn. 24(1), 49–64 (1996)
  8. Fiaschi, L., Köthe, U., Nair, R., Hamprecht, F.A.: Learning to count with regression forest and structured labels. In: Proceedings of the 21st International Conference on Pattern Recognition (ICPR), pp. 2685–2688 (2012)
  9. Criminisi, A., Shotton, J., Konukoglu, E.: Decision forests: a unified framework for classification, regression, density estimation, manifold learning and semi-supervised learning. Found. Trends Comput. Graph. Vis. 7(2–3), 81–227 (2012)
  10. Liaw, A., Wiener, M.: Classification and regression by random forest. R News 2(3), 18–22 (2002)
  11. Friedman, J.H., Meulman, J.J.: Multiple additive regression trees with application in epidemiology. Stat. Med. 22(9), 1365–1381 (2003)
  12. Elish, M.O.: Improved estimation of software project effort using multiple additive regression trees. Expert Syst. Appl. 36(7), 10774–10778 (2009)
  13. Stone, C.J.: Additive regression and other nonparametric models. Ann. Stat. 689–705 (1985)
  14. Witten, I.H., Frank, E., Hall, M.A., Pal, C.J.: Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann

Copyright information

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Department of Computer Science and Engineering, ABV-Indian Institute of Information Technology and Management Gwalior, Gwalior, India
  2. Department of Computer Science and Engineering, Indian Institute of Technology Roorkee, Roorkee, India
