
Hyper-Parameter Tuning for Graph Kernels via Multiple Kernel Learning

  • Conference paper
  • Published in: Neural Information Processing (ICONIP 2016)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9948)


Abstract

Kernelized learning algorithms have seen steady growth in popularity over the last decades. Estimating the performance of these kernels in real applications is typically computationally demanding because of the hyper-parameter selection process. This is especially true for graph kernels, which are already expensive to compute. In this paper, we study an approach that replaces the commonly adopted hyper-parameter selection procedure with a multiple kernel learning procedure that learns a linear combination of the kernel matrices obtained by the same kernel with different hyper-parameter values. Empirical results on real-world graph datasets show that the proposed methodology is faster than the baseline when the number of parameter configurations is large, while always maintaining comparable, and in some cases superior, predictive performance.
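As a rough illustration of the idea described above, the sketch below builds one Gram matrix per candidate hyper-parameter value and trains a single SVM on a learned convex combination of them, instead of validating a separate model for each value. It is a minimal sketch under explicit assumptions: an RBF kernel over synthetic vectors stands in for a graph kernel, and the combination weights come from a simple centered kernel-target alignment heuristic, not from the multiple kernel learning algorithm used in the paper.

```python
# Minimal sketch: learn a convex combination of Gram matrices, one per
# hyper-parameter value, rather than cross-validating each value separately.
# Assumptions for illustration: RBF kernel on synthetic vectors stands in for
# a graph kernel; weights come from centered kernel-target alignment.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

gammas = [0.001, 0.01, 0.1, 1.0]  # hyper-parameter grid (one Gram matrix each)

def gram(A, B, g):
    # Stand-in for a graph kernel evaluated with hyper-parameter value g.
    return rbf_kernel(A, B, gamma=g)

def center(K):
    # Center a Gram matrix: H K H with H = I - (1/n) 11^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

# Weight each hyper-parameter value by the centered alignment between its
# training Gram matrix and the label matrix y y^T (labels mapped to +/-1).
yy = np.outer(2 * y_tr - 1, 2 * y_tr - 1)
weights = []
for g in gammas:
    Kc = center(gram(X_tr, X_tr, g))
    align = np.sum(Kc * yy) / (np.linalg.norm(Kc) * np.linalg.norm(yy))
    weights.append(max(align, 0.0))
weights = np.asarray(weights) / np.sum(weights)

# Combine the Gram matrices and train a single SVM on the precomputed kernel,
# instead of training and validating one SVM per hyper-parameter value.
K_tr = sum(w * gram(X_tr, X_tr, g) for w, g in zip(weights, gammas))
K_te = sum(w * gram(X_te, X_tr, g) for w, g in zip(weights, gammas))
clf = SVC(kernel="precomputed").fit(K_tr, y_tr)

print("combination weights:", np.round(weights, 3))
print("test accuracy:", clf.score(K_te, y_te))
```

The intended contrast is with the baseline in the abstract: with k candidate hyper-parameter values, the standard selection procedure trains and validates k separate models, whereas the combination above needs a single training run once the k Gram matrices are available.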


Notes

  1. A graph where vertices are atoms and edges are chemical bonds; the label attached to each vertex reports the atom type (a minimal example of such a representation is sketched below).

  2. http://cheminformatics.org/datasets/bursi/.
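To make note 1 concrete, here is a purely illustrative sketch of such a vertex-labelled molecular graph using networkx (an assumed library choice; the toy atoms and bonds below are made up and do not come from the datasets used in the paper):

```python
# Illustrative only: a vertex-labelled graph for a toy molecule, with atoms as
# vertices, chemical bonds as edges, and the atom type as the vertex label.
import networkx as nx

mol = nx.Graph()
atoms = {0: "C", 1: "C", 2: "O", 3: "H"}       # vertex labels = atom types
for v, atom in atoms.items():
    mol.add_node(v, label=atom)
mol.add_edges_from([(0, 1), (1, 2), (2, 3)])   # edges = chemical bonds

print(nx.get_node_attributes(mol, "label"))    # {0: 'C', 1: 'C', 2: 'O', 3: 'H'}
print(list(mol.edges()))
```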


Acknowledgments

This work was supported by the University of Padova under the strategic project BIOINFOGEN.

Author information


Corresponding author

Correspondence to Nicolò Navarin.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Massimo, C.M., Navarin, N., Sperduti, A. (2016). Hyper-Parameter Tuning for Graph Kernels via Multiple Kernel Learning. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds.) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science, vol. 9948. Springer, Cham. https://doi.org/10.1007/978-3-319-46672-9_25


  • DOI: https://doi.org/10.1007/978-3-319-46672-9_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-46671-2

  • Online ISBN: 978-3-319-46672-9

