
The Effect of Biased Sampling in Radial Basis Function Networks for Data Mining

  • Conference paper
Multimedia, Computer Graphics and Broadcasting (MulGraB 2011)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 262)

Abstract

Radial basis function (RBF) networks are known to perform very well on classification tasks in data mining, and the k-means clustering algorithm is often used to determine the centers and radii of the networks' radial basis functions. Among the many factors that influence the performance of a generated RBF network, the given training data set matters a great deal, so we seek better classification models from the available data. We trained RBF networks on biased samples as well as conventional samples to find better classification models. Experiments with real-world data sets showed that biased samples could produce better knowledge models for some classes while conventional samples produced better models for other classes, so the two kinds of samples can be used to complement each other.
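
The paper itself does not include source code. The following is a minimal illustrative sketch, in Python with NumPy and scikit-learn, of the general procedure the abstract describes: an RBF network whose basis-function centers and radii are obtained with k-means, trained once on a conventional (stratified) sample and once on a sample deliberately biased toward one class, with per-class accuracy compared on a held-out test set. The data set (scikit-learn's built-in breast-cancer data), the oversampling factor, and all function names are assumptions made for illustration, not the authors' experimental setup.

# Illustrative sketch only -- not the authors' implementation or data.
# RBF network with k-means centers/radii, trained on a conventional sample
# versus a sample deliberately biased toward one class.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

def fit_rbf(X, y, n_centers=20, seed=0):
    """Fit a simple RBF network: k-means centers, per-center radii, linear readout."""
    km = KMeans(n_clusters=n_centers, n_init=10, random_state=seed).fit(X)
    centers = km.cluster_centers_
    # radius of each basis function = mean distance of its cluster members to the center
    radii = np.array([np.linalg.norm(X[km.labels_ == j] - centers[j], axis=1).mean() + 1e-8
                      for j in range(n_centers)])

    def hidden(Z):
        # Gaussian activations of each sample with respect to each center
        d = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2)
        return np.exp(-d ** 2 / (2.0 * radii ** 2))

    # one-hot targets, least-squares output weights
    T = np.eye(int(y.max()) + 1)[y]
    W, *_ = np.linalg.lstsq(hidden(X), T, rcond=None)
    return lambda Z: hidden(Z) @ W

def biased_sample(X, y, cls, factor=3, seed=0):
    """Oversample class `cls` by `factor` to bias the training set toward it (illustrative choice)."""
    rng = np.random.default_rng(seed)
    idx = np.where(y == cls)[0]
    extra = rng.choice(idx, size=(factor - 1) * len(idx), replace=True)
    keep = np.concatenate([np.arange(len(y)), extra])
    return X[keep], y[keep]

# standardize features so k-means distances are comparable across attributes
X, y = load_breast_cancer(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

samples = {"conventional": (Xtr, ytr),
           "biased toward class 0": biased_sample(Xtr, ytr, cls=0)}
for name, (Xs, ys) in samples.items():
    pred = fit_rbf(Xs, ys)(Xte).argmax(axis=1)
    per_class = [float((pred[yte == c] == c).mean()) for c in np.unique(yte)]
    print(f"{name}: per-class accuracy = {np.round(per_class, 3)}")

Comparing the per-class accuracies of the two runs mirrors the kind of result the abstract reports: the biased sample may improve accuracy on some classes while the conventional sample remains better on others, so the two resulting models can be used to complement each other.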



Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sug, H. (2011). The Effect of Biased Sampling in Radial Basis Function Networks for Data Mining. In: Kim, Th., et al. Multimedia, Computer Graphics and Broadcasting. MulGraB 2011. Communications in Computer and Information Science, vol 262. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27204-2_36

  • DOI: https://doi.org/10.1007/978-3-642-27204-2_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-27203-5

  • Online ISBN: 978-3-642-27204-2

  • eBook Packages: Computer Science, Computer Science (R0)
