Abstract
With the recent proliferation of data classified by computer, machine learning has attracted great attention. A common approach is supervised learning, which classifies data using a large number of training examples annotated with class labels (labeled data). Random Erasing, a data augmentation technique, is effective for improving the performance of supervised learning. However, because supervised learning requires a large amount of labeled data, the cost of manually attaching label information to unclassified training examples (unlabeled data) is very high. In this paper, we propose a method that achieves high classification accuracy by applying Random Erasing to semi-supervised learning with few labeled data and many unlabeled data. Our evaluation confirms the effectiveness of the proposed method compared with conventional methods.
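The abstract refers to Random Erasing, an augmentation that occludes a random rectangle of each training image so the classifier cannot rely on any single region. The following is a minimal sketch of that idea in plain NumPy; the function name, parameter defaults, and noise fill are illustrative assumptions, not the paper's exact settings.

```python
import random
import numpy as np

def random_erasing(img, p=0.5, area_range=(0.02, 0.4),
                   aspect_range=(0.3, 3.3), rng=None):
    """Randomly erase one rectangle of `img` (H x W x C float array),
    filling it with uniform noise. Defaults here are illustrative."""
    rng = rng or random.Random()
    if rng.random() > p:
        return img  # leave the image unchanged with probability 1 - p
    h, w = img.shape[:2]
    for _ in range(100):  # retry until a sampled rectangle fits
        area = rng.uniform(*area_range) * h * w
        aspect = rng.uniform(*aspect_range)
        eh = int(round((area * aspect) ** 0.5))  # erased height
        ew = int(round((area / aspect) ** 0.5))  # erased width
        if 0 < eh < h and 0 < ew < w:
            top = rng.randrange(h - eh)
            left = rng.randrange(w - ew)
            out = img.copy()
            out[top:top + eh, left:left + ew] = np.random.rand(eh, ew, img.shape[2])
            return out
    return img  # no fitting rectangle found; return unchanged
```

Applied on the fly during training, each epoch sees differently occluded versions of the same image, which is what makes the augmentation useful when labeled data are scarce.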
Acknowledgement
This work was supported by JSPS KAKENHI Grant Number 18K11265. This paper was also partially supported by the Innovation Platform for Society 5.0 program of the Ministry of Education, Culture, Sports, Science and Technology, Japan.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Okahana, Y., Gotoh, Y. (2020). A Study for Semi-supervised Learning with Random Erasing. In: Barolli, L., Okada, Y., Amato, F. (eds) Advances in Internet, Data and Web Technologies. EIDWT 2020. Lecture Notes on Data Engineering and Communications Technologies, vol 47. Springer, Cham. https://doi.org/10.1007/978-3-030-39746-3_49
Print ISBN: 978-3-030-39745-6
Online ISBN: 978-3-030-39746-3
eBook Packages: Intelligent Technologies and Robotics (R0)