
Method of Static Classifiers Selection Using the Weights of Base Classifiers

  • Chapter
Soft Computing in Computer and Information Science

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 342)


Abstract

The choice of a pertinent objective function is one of the most crucial elements in static ensemble selection. In this study, a new approach to calculating the weights of base classifiers is developed. These weights form the basis for selecting classifiers from the initial pool, and the obtained weights are interpreted in the context of interval logic. A number of experiments have been carried out on several datasets from the UCI repository; they compare the proposed algorithms with the base classifiers and with the oracle, sum, product, and mean methods.
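
To make the setting concrete, the sketch below illustrates static ensemble selection driven by base-classifier weights. It is only a minimal, hypothetical example: the weight used here is plain validation-set accuracy and the selection rule is a simple mean-weight threshold, whereas the chapter derives its own weights and interprets them with interval logic. The dataset, classifier pool, and threshold are assumptions for illustration; the sum, product, and mean fusion baselines correspond to those named in the abstract.

```python
# A minimal, hypothetical sketch of static ensemble selection by classifier weights.
# The weight used here (held-out validation accuracy) and the mean-weight threshold
# are assumptions for illustration; the chapter defines its own weights and
# interprets them with interval logic.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Initial pool of heterogeneous base classifiers (an arbitrary choice for the sketch).
pool = [GaussianNB(), KNeighborsClassifier(5), DecisionTreeClassifier(random_state=0)]
for clf in pool:
    clf.fit(X_train, y_train)

# Assumed weight of each base classifier: its accuracy on a validation set.
weights = np.array([clf.score(X_val, y_val) for clf in pool])

# Static selection: keep the classifiers whose weight reaches the pool's mean weight.
selected = [clf for clf, w in zip(pool, weights) if w >= weights.mean()]

# Fusion baselines named in the abstract: sum, product, and mean of the
# posterior probabilities produced by the selected classifiers.
probs = np.stack([clf.predict_proba(X_test) for clf in selected])  # (n_clf, n_samples, n_classes)
for name, fused in [("sum", probs.sum(axis=0)),
                    ("product", probs.prod(axis=0)),
                    ("mean", probs.mean(axis=0))]:
    acc = (fused.argmax(axis=1) == y_test).mean()
    print(f"{name} fusion accuracy: {acc:.3f}")
```

With a fixed set of selected classifiers the sum and mean rules give identical decisions; both are listed only to mirror the comparison reported in the abstract.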



Acknowledgments

This work was supported by the Polish National Science Center under the grant no. DEC-2013/09/B/ST6/02264 and by the statutory funds of the Department of Systems and Computer Networks, Wroclaw University of Technology.

Author information


Corresponding author

Correspondence to Robert Burduk.



Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Burduk, R. (2015). Method of Static Classifiers Selection Using the Weights of Base Classifiers. In: Wiliński, A., Fray, I., Pejaś, J. (eds) Soft Computing in Computer and Information Science. Advances in Intelligent Systems and Computing, vol 342. Springer, Cham. https://doi.org/10.1007/978-3-319-15147-2_8


  • DOI: https://doi.org/10.1007/978-3-319-15147-2_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-15146-5

  • Online ISBN: 978-3-319-15147-2

  • eBook Packages: Engineering, Engineering (R0)
