Compact Coding for Hyperplane Classifiers in Heterogeneous Environment

  • Hao Shao
  • Bin Tong
  • Einoshin Suzuki
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6913)

Abstract

Transfer learning techniques have seen significant development in real applications, where knowledge from previous tasks is exploited to reduce the high cost of acquiring labeled information for the target task. However, how to avoid negative transfer, which occurs when tasks in a heterogeneous environment have different distributions, is still an open problem. To handle this issue, we propose a Compact Coding method for Hyperplane Classifiers (CCHC) under a two-level framework in the inductive transfer learning setting. Unlike traditional methods, we measure the similarities among tasks from a macro-level perspective through minimum encoding. Specifically, the degree of similarity is represented by the code length of the class boundary of each source task with respect to the target task. In addition, informative parts of the source tasks are adaptively selected from a micro-level viewpoint to make the choice of the specific source task more accurate. Extensive experiments on both UCI and text data sets show the effectiveness of our algorithm in terms of classification accuracy.
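The macro-level idea can be illustrated with a small sketch. The following is not the paper's CCHC procedure, only a minimal illustration of the minimum-encoding connection it builds on (code length ≈ negative log2-likelihood, per Shannon), assuming linear source classifiers (w, b) and labels in {-1, +1}; the function names and the logistic link are our own assumptions for the example:

```python
import numpy as np

def code_length_bits(w, b, X_target, y_target):
    """Bits needed to encode the target labels with a source task's
    hyperplane (w, b), using L(y | model) = -log2 P(y | model).
    A logistic link converts signed margins into label probabilities."""
    margins = np.clip(y_target * (X_target @ w + b), -50.0, 50.0)
    p = np.clip(1.0 / (1.0 + np.exp(-margins)), 1e-12, 1.0)
    return float(-np.log2(p).sum())

def rank_source_tasks(source_hyperplanes, X_target, y_target):
    """Order source hyperplanes by how cheaply they encode the target
    labels: a shorter code length marks a more similar source task."""
    lengths = [code_length_bits(w, b, X_target, y_target)
               for (w, b) in source_hyperplanes]
    return np.argsort(lengths), lengths
```

In this reading, a source task whose class boundary compresses the target labels poorly would be down-weighted or discarded, which is how minimum encoding can guard against negative transfer; CCHC additionally selects informative parts of each source task at the micro level, which this sketch does not model.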


Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Hao Shao (1)
  • Bin Tong (1)
  • Einoshin Suzuki (2)

  1. Graduate School of Systems Life Sciences, Kyushu University, Japan
  2. Department of Informatics, ISEE, Kyushu University, Japan