
A Study for Semi-supervised Learning with Random Erasing

Conference paper

Part of the book series: Lecture Notes on Data Engineering and Communications Technologies (LNDECT, volume 47)

Abstract

With the recent proliferation of data classified by computers, machine learning has attracted great attention. A common approach is supervised learning, which classifies data using a large amount of training data annotated with class labels (labeled data). To improve the performance of supervised learning, the data augmentation technique Random Erasing is effective. However, since supervised learning requires a large amount of labeled data, the cost of manually adding label information to unclassified training examples (unlabeled data) is very high. In this paper, we propose a method that achieves high classification accuracy by applying Random Erasing to semi-supervised learning, which uses a small amount of labeled data together with unlabeled data. In our evaluation, we confirm the effectiveness of the proposed method compared with conventional methods.
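
The Random Erasing augmentation discussed in the abstract (Zhong et al., 2017) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function name and parameter defaults are assumptions chosen to mirror the hyperparameters commonly used for this technique.

```python
import numpy as np

def random_erasing(image, p=0.5, area_range=(0.02, 0.4),
                   aspect_range=(0.3, 3.3), rng=None):
    """With probability p, erase a random rectangle of `image` (H x W x C),
    filling it with random noise. A sketch of Random Erasing augmentation."""
    rng = rng or np.random.default_rng()
    if rng.random() > p:
        return image  # skip augmentation with probability 1 - p
    h, w = image.shape[:2]
    for _ in range(100):  # retry until a valid rectangle fits inside the image
        target_area = rng.uniform(*area_range) * h * w
        aspect = rng.uniform(*aspect_range)
        erase_h = int(round(np.sqrt(target_area * aspect)))
        erase_w = int(round(np.sqrt(target_area / aspect)))
        if 0 < erase_h < h and 0 < erase_w < w:
            top = rng.integers(0, h - erase_h)
            left = rng.integers(0, w - erase_w)
            out = image.copy()
            # fill the erased region with uniform random pixel values
            out[top:top + erase_h, left:left + erase_w] = rng.uniform(
                0, 1, size=(erase_h, erase_w, image.shape[2]))
            return out
    return image  # no valid rectangle found; return the image unchanged
```

Because the erased region forces the classifier to rely on the remaining context, the augmentation acts as a regularizer; in a semi-supervised setting it can be applied to both labeled and unlabeled inputs.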



Acknowledgement

This work was supported by JSPS KAKENHI Grant Number 18K11265. In addition, this paper was partially supported by the Innovation Platform for Society 5.0 program of the Ministry of Education, Culture, Sports, Science and Technology, Japan.

Author information

Correspondence to Yusuke Gotoh.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Okahana, Y., Gotoh, Y. (2020). A Study for Semi-supervised Learning with Random Erasing. In: Barolli, L., Okada, Y., Amato, F. (eds) Advances in Internet, Data and Web Technologies. EIDWT 2020. Lecture Notes on Data Engineering and Communications Technologies, vol 47. Springer, Cham. https://doi.org/10.1007/978-3-030-39746-3_49
