Abstract
Training deep learning models such as CNNs, LSTMs, or GRUs often requires large amounts of data. In real-world applications, however, the amount of data, especially labelled data, is limited. To address this challenge, we study Deep Transfer Learning (DTL) in the context of Multi-task Learning (MTL), extracting sharable knowledge from one task and reusing it for related tasks. In this paper, we use the Minimum Enclosing Ball (MEB) as a flexible knowledge representation to map shared domain knowledge from a primary task to a secondary task in multi-task learning. Our experiments provide both analytic and empirical results demonstrating the effectiveness and robustness of the proposed MEB-based deep transfer learning.
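The minimum enclosing ball of a point set can be approximated efficiently with the classic Bădoiu–Clarkson core-set iteration, which repeatedly moves a candidate center toward the furthest point. The sketch below is a generic illustration of that iteration in Python with NumPy, not the authors' implementation; the function name `approx_meb` and the tolerance parameter `eps` are our own.

```python
import numpy as np

def approx_meb(points, eps=0.05):
    """(1 + eps)-approximate minimum enclosing ball via the
    Badoiu-Clarkson core-set iteration. Returns (center, radius).

    A generic sketch, not the paper's implementation.
    """
    c = points[0].astype(float).copy()
    # O(1 / eps^2) iterations suffice for a (1 + eps)-approximation.
    iters = int(np.ceil(1.0 / eps ** 2))
    for i in range(1, iters + 1):
        # Find the point furthest from the current center.
        dists = np.linalg.norm(points - c, axis=1)
        furthest = points[np.argmax(dists)]
        # Step the center 1/(i + 1) of the way toward that point.
        c = c + (furthest - c) / (i + 1)
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius
```

For example, on the corners of a 2x2 square the ball converges to center (1, 1) with radius close to sqrt(2).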
© 2018 Springer Nature Switzerland AG
Cite this paper
Deng, Z., Liu, F., Zhao, J., Wei, Q., Pang, S., Leng, Y. (2018). Deep Transfer Learning via Minimum Enclosing Balls. In: Cheng, L., Leung, A., Ozawa, S. (eds.) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol. 11303. Springer, Cham. https://doi.org/10.1007/978-3-030-04182-3_18
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04181-6
Online ISBN: 978-3-030-04182-3
eBook Packages: Computer Science, Computer Science (R0)