
Generating Balanced Classifier-Independent Training Samples from Unlabeled Data

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7301)

Abstract

We consider the problem of generating balanced training samples from an unlabeled data set with an unknown class distribution. While random sampling works well when the data is balanced, it is highly ineffective for unbalanced data. Other approaches, such as active learning and cost-sensitive learning, are also suboptimal: they are classifier-dependent and require misclassification costs or labeled samples. We propose a new strategy for generating training samples that is independent of both the underlying class distribution of the data and the classifier that will be trained on the labeled data.

Our methods are iterative and can be seen as variants of active learning, where we use semi-supervised clustering at each iteration to perform biased sampling from the clusters. Several strategies are provided to estimate the underlying class distributions in the clusters and to increase the balance of the training samples. Experiments with both highly skewed and balanced data from the UCI repository and a private data set show that our algorithm produces much more balanced samples than random sampling or uncertainty sampling. Further, our sampling strategy is substantially more efficient than active learning methods. The experiments also confirm that, with more balanced training data, classifiers trained with our samples outperform classifiers trained with random sampling or active learning.
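
The abstract describes the sampling loop only at a high level. A minimal sketch of that loop is given below, assuming a standard scikit-learn setup: it clusters the unlabeled pool, estimates each cluster's class make-up from the labels gathered so far, and biases the next batch of label queries toward clusters that appear to hold under-represented classes. This is an illustration of the idea rather than the authors' algorithm: plain k-means stands in for the paper's semi-supervised clustering, the per-cluster weighting heuristic is simplified, and all names (balanced_sample, oracle, batch_size, and so on) are hypothetical.

```python
# Sketch of iterative, cluster-based biased sampling for building a balanced
# training set from unlabeled data. NOT the authors' algorithm: plain k-means
# replaces the paper's semi-supervised clustering, and the cluster weighting
# below is a simplified, hypothetical heuristic.
import numpy as np
from sklearn.cluster import KMeans


def balanced_sample(X, oracle, n_rounds=5, batch_size=20, n_clusters=10,
                    random_state=0):
    """Iteratively query labels so that the labeled set stays roughly balanced.

    X      : (n_samples, n_features) unlabeled pool
    oracle : callable mapping a pool index to its true label
             (stands in for a human annotator)
    Returns the queried indices and their labels.
    """
    rng = np.random.default_rng(random_state)
    labeled_idx, labels = [], []

    for _ in range(n_rounds):
        # Cluster the pool. The paper uses semi-supervised clustering informed
        # by the labels gathered so far; plain k-means keeps the sketch short.
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state)
        assign = km.fit_predict(X)

        # Estimate each cluster's class make-up from labels seen so far, and
        # up-weight clusters whose apparent majority class is still rare in the
        # training sample (or that have no labeled points yet).
        class_counts = {}
        for y in labels:
            class_counts[y] = class_counts.get(y, 0) + 1
        weights = np.ones(n_clusters)
        for c in range(n_clusters):
            seen = [labels[i] for i, j in enumerate(labeled_idx) if assign[j] == c]
            if seen:
                maj = max(set(seen), key=seen.count)
                weights[c] = 1.0 / (1 + class_counts.get(maj, 0))
        weights /= weights.sum()

        # Biased sampling: draw the next batch from clusters in proportion
        # to their weights, skipping points that are already labeled.
        per_cluster = rng.multinomial(batch_size, weights)
        for c, k in enumerate(per_cluster):
            candidates = [i for i in np.where(assign == c)[0] if i not in labeled_idx]
            for i in rng.permutation(candidates)[:k]:
                labeled_idx.append(int(i))
                labels.append(oracle(int(i)))

    return labeled_idx, labels
```

When experimenting on a fully labeled benchmark such as the UCI data sets mentioned in the abstract, the oracle can simply look up the held-back ground truth, e.g. `oracle = lambda i: y_true[i]`.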

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Park, Y., Qi, Z., Chari, S.N., Molloy, I.M. (2012). Generating Balanced Classifier-Independent Training Samples from Unlabeled Data. In: Tan, P.N., Chawla, S., Ho, C.K., Bailey, J. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2012. Lecture Notes in Computer Science, vol 7301. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30217-6_23

  • DOI: https://doi.org/10.1007/978-3-642-30217-6_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-30216-9

  • Online ISBN: 978-3-642-30217-6

  • eBook Packages: Computer Science, Computer Science (R0)
