Abstract
A new classification algorithm based on a combination of kernel density estimators is introduced. The method combines estimators with different bandwidths, which can be interpreted as looking at the data at different “resolutions”; this, in turn, potentially gives the algorithm insight into the structure of the data. The bandwidths are adjusted automatically to decrease the classification error. Experimental results on benchmark data sets show promising performance of the proposed approach compared to classical algorithms.
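The paper's specific combination scheme and bandwidth-adjustment procedure are not detailed in this preview, so the following is only an illustrative sketch of the general idea: a Bayes-style classifier whose per-class density estimate averages Gaussian kernel density estimates computed at several bandwidths (the fixed bandwidth grid and the simple averaging are assumptions, not the authors' method).

```python
import numpy as np

def kde(points, x, h):
    """Gaussian kernel density estimate at query points x,
    given samples `points` and a single bandwidth h."""
    d = points.shape[1]
    diffs = (x[:, None, :] - points[None, :, :]) / h
    k = np.exp(-0.5 * np.sum(diffs ** 2, axis=2))
    k /= (2 * np.pi) ** (d / 2) * h ** d
    return k.mean(axis=1)

def classify(X_train, y_train, X_test, bandwidths=(0.1, 0.3, 1.0)):
    """Assign each test point to the class maximizing
    prior * (density averaged over several bandwidths)."""
    classes = np.unique(y_train)
    scores = np.zeros((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        pts = X_train[y_train == c]
        prior = len(pts) / len(X_train)
        # Combine estimates at different "resolutions" by averaging.
        dens = np.mean([kde(pts, X_test, h) for h in bandwidths], axis=0)
        scores[:, j] = prior * dens
    return classes[np.argmax(scores, axis=1)]
```

In the paper the bandwidths are tuned automatically to reduce classification error; here they are simply fixed to keep the example short.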
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Kobos, M., Mańdziuk, J. (2009). Classification Based on Combination of Kernel Density Estimators. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5769. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04277-5_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-04276-8
Online ISBN: 978-3-642-04277-5
eBook Packages: Computer Science