Multiple Kernel Fusion with HSIC Lasso

  • Conference paper
  • In: PRICAI 2018: Trends in Artificial Intelligence (PRICAI 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11012)


Abstract

Multiple kernel learning (MKL) is a principled approach to kernel fusion for various learning tasks, such as classification, clustering, and dimensionality reduction. In this paper, we develop a novel multiple kernel learning model based on the Hilbert-Schmidt independence criterion (HSIC) for classification, called HSIC-MKL. In the proposed HSIC-MKL model, we first propose an HSIC Lasso-based MKL formulation, which not only has a clear statistical interpretation, namely that kernels with minimum redundancy and maximum dependence on the output labels are found and combined, but also admits a globally optimal solution that can be computed efficiently by solving a Lasso optimization problem. After the optimal kernel is obtained, a support vector machine (SVM) is used to select the prediction hypothesis. The proposed HSIC-MKL is therefore a two-stage kernel learning approach. Extensive experiments on real-world data sets from the UCI benchmark repository validate the superiority of the proposed model in terms of prediction accuracy.
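
The two-stage procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy data, the RBF kernel dictionary, the delta kernel on the labels, and the regularization strength are all assumptions, and a non-negative Lasso fit of the vectorized centred label kernel on the vectorized centred base kernels stands in for the paper's HSIC Lasso formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC


def center(K):
    """Double-centre a Gram matrix: H K H with H = I - (1/n) 1 1^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H


# Toy data and a small dictionary of candidate RBF kernels (illustrative choices).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = (X[:, 0] + 0.1 * rng.standard_normal(100) > 0).astype(int)

gammas = [0.1, 1.0, 10.0]
kernels = [rbf_kernel(X, X, gamma=g) for g in gammas]

# Stage 1: HSIC-Lasso-style kernel weights.  Regress the vectorised centred
# label kernel on the vectorised centred base kernels with an L1 penalty and
# a non-negativity constraint, so redundant kernels receive zero weight.
L = center(np.equal.outer(y, y).astype(float))               # delta kernel on labels
Phi = np.column_stack([center(K).ravel() for K in kernels])  # one column per kernel
lasso = Lasso(alpha=1e-3, positive=True, fit_intercept=False, max_iter=10000)
lasso.fit(Phi, L.ravel())
weights = lasso.coef_                                        # sparse, non-negative

# Stage 2: fuse the kernels with the learned weights and train an SVM
# on the combined (precomputed) kernel.
K_fused = sum(w * K for w, K in zip(weights, kernels))
svm = SVC(kernel="precomputed").fit(K_fused, y)
print("kernel weights:", weights)
print("training accuracy:", svm.score(K_fused, y))
```

Fitting with a non-negativity constraint keeps the learned weights non-negative, so the fused matrix remains a valid (positive semidefinite) kernel for the SVM stage.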


Notes

  1. In statistics and machine learning, the Lasso (least absolute shrinkage and selection operator, also written LASSO) is a regression analysis method that performs both variable selection and regularization in order to enhance the prediction accuracy and interpretability of the resulting statistical model (the standard objective is written out after these notes).

  2. http://asi.insa-rouen.fr/enseignants/~arakoto/toolbox/.

  3. http://asi.insa-rouen.fr/enseignants/~arakoto/code/mklindex.html.
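
For reference, the standard Lasso objective mentioned in note 1 can be written as follows; the notation (design matrix X, response y, coefficients beta, regularization parameter lambda) is the usual textbook one and is not taken from the paper:

```latex
\min_{\beta \in \mathbb{R}^{p}} \; \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda \lVert \beta \rVert_1
```

The L1 penalty drives some coefficients exactly to zero, which is what gives the Lasso its variable-selection behaviour; the HSIC Lasso formulation referenced in the abstract applies an analogous sparsity-inducing penalty to the kernel combination weights.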


Acknowledgements

This work is supported in part by the National Natural Science Foundation of China (No. 61562003).

Author information

Correspondence to Tinghua Wang.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Wang, T., Liu, F. (2018). Multiple Kernel Fusion with HSIC Lasso. In: Geng, X., Kang, B.H. (eds.) PRICAI 2018: Trends in Artificial Intelligence. PRICAI 2018. Lecture Notes in Computer Science, vol. 11012. Springer, Cham. https://doi.org/10.1007/978-3-319-97304-3_19

  • DOI: https://doi.org/10.1007/978-3-319-97304-3_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97303-6

  • Online ISBN: 978-3-319-97304-3
