
Classification Based on Combination of Kernel Density Estimators

  • Conference paper
Artificial Neural Networks – ICANN 2009 (ICANN 2009)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5769)

Included in the following conference series: International Conference on Artificial Neural Networks (ICANN)

Abstract

A new classification algorithm based on a combination of kernel density estimators is introduced. The method combines estimators with different bandwidths, which can be interpreted as looking at the data at different “resolutions”; this, in turn, potentially gives the algorithm insight into the structure of the data. The bandwidths are adjusted automatically to decrease the classification error. Results of experiments on benchmark data sets show promising performance of the proposed approach compared to classical algorithms.
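To make the idea concrete, the snippet below is a minimal, purely illustrative Python sketch of classifying with a combination of Gaussian kernel density estimates computed at several fixed bandwidths per class. The specific bandwidth values, the uniform combination weights, and the helper names (gaussian_kde, combined_density, classify) are assumptions made for illustration only; the paper's automatic bandwidth-adjustment procedure is not reproduced here.

```python
# Illustrative sketch: classify by averaging Gaussian KDEs with several
# bandwidths ("resolutions") per class. Bandwidths and the uniform
# combination weights are assumptions, not the paper's tuned values.
import numpy as np

def gaussian_kde(x, samples, h):
    """Gaussian KDE value at scalar point x for 1-D `samples` and bandwidth `h`."""
    z = (x - samples) / h
    return np.mean(np.exp(-0.5 * z**2) / (h * np.sqrt(2.0 * np.pi)))

def combined_density(x, samples, bandwidths):
    """Average of KDEs computed with different bandwidths."""
    return np.mean([gaussian_kde(x, samples, h) for h in bandwidths])

def classify(x, class_samples, bandwidths):
    """Assign x to the class with the highest prior-weighted combined density."""
    n_total = sum(len(s) for s in class_samples.values())
    scores = {
        label: (len(s) / n_total) * combined_density(x, np.asarray(s), bandwidths)
        for label, s in class_samples.items()
    }
    return max(scores, key=scores.get)

# Toy usage: two 1-D classes and three hand-picked bandwidths.
rng = np.random.default_rng(0)
data = {"A": rng.normal(-1.0, 0.5, 100), "B": rng.normal(1.0, 0.7, 100)}
print(classify(0.8, data, bandwidths=[0.1, 0.3, 1.0]))  # expected: "B"
```

In the sketch the bandwidths are fixed by hand; in the approach described in the abstract they would instead be adjusted automatically so as to decrease the classification error.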

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kobos, M., Mańdziuk, J. (2009). Classification Based on Combination of Kernel Density Estimators. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5769. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04277-5_13

  • DOI: https://doi.org/10.1007/978-3-642-04277-5_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04276-8

  • Online ISBN: 978-3-642-04277-5

  • eBook Packages: Computer Science, Computer Science (R0)
