Abstract
Radial basis function (RBF) networks are known to perform very well in classification tasks in data mining, and the k-means clustering algorithm is often used to determine the centers and radii of the networks' radial basis functions. Because the performance of a generated RBF network depends strongly on the given training data set, among many other factors, we seek better classification models from the given data. To this end, we used biased samples in addition to conventional samples to build RBF network classification models. Experiments with real-world data sets were successful: biased samples produced better knowledge models for some classes, while conventional samples produced better models for other classes, so the two sampling strategies can be used to complement each other.
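The pipeline the abstract describes, k-means to place the RBF centers and a trained output layer on top, can be sketched as follows. This is a minimal illustration, not the paper's exact setup: the plain k-means loop, the fixed Gaussian width `sigma`, the synthetic two-class data, and the least-squares output weights are all assumptions made for the sketch.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's k-means: returns k cluster centers for X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def rbf_features(X, centers, sigma):
    """Gaussian RBF activations of X w.r.t. the given centers."""
    d2 = ((X[:, None] - centers[None]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# toy two-class data (well-separated Gaussian blobs)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2.0, 0.5, size=(50, 2)),
               rng.normal(2.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# step 1: k-means fixes the hidden-layer centers
centers = kmeans(X, k=4)
# step 2: hidden-layer activations with an assumed common width
Phi = rbf_features(X, centers, sigma=1.0)
# step 3: linear output weights via least squares on one-hot targets
T = np.eye(2)[y]
W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
pred = (Phi @ W).argmax(axis=1)
accuracy = (pred == y).mean()  # training accuracy on the toy data
```

In practice the radius of each basis function is often derived per-cluster (e.g. from the spread of the points assigned to that center) rather than shared, and the output layer may be trained by gradient descent instead of least squares; this sketch only shows the overall structure.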
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Sug, H. (2011). The Effect of Biased Sampling in Radial Basis Function Networks for Data Mining. In: Kim, Th., et al. Multimedia, Computer Graphics and Broadcasting. MulGraB 2011. Communications in Computer and Information Science, vol 262. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27204-2_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-27203-5
Online ISBN: 978-3-642-27204-2