A Jensen-Shannon Divergence Kernel for Directed Graphs
Kernel methods have recently been widely employed to solve machine learning problems such as classification and clustering. Although many graph kernels exist for comparing patterns represented by undirected graphs, the corresponding methods for directed structures are less developed. To fill this gap in the literature, we combine graph kernels with graph complexity measures and present an information-theoretic kernel for assessing the similarity between a pair of directed graphs. In particular, we show how the Jensen-Shannon divergence, a mutual-information measure that gauges the difference between probability distributions, can be combined with the recently developed von Neumann entropy for directed graphs to compute the kernel. Our experiments show that the resulting kernel provides an efficient tool for classifying directed graphs with different structures and for analyzing real-world complex data.
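To illustrate the kernel construction described above, the sketch below computes a Jensen-Shannon kernel between two directed graphs. It is a minimal, hedged example: in place of the paper's directed-graph von Neumann entropy it uses the Shannon entropy of each graph's out-degree-value distribution, and all function names (`degree_distribution`, `js_kernel`) are illustrative, not taken from the paper.

```python
import math
from collections import Counter

def degree_distribution(n_nodes, edges):
    """P(d): fraction of nodes whose out-degree is d.

    A graph is (n_nodes, edges) with directed edges (u, v).
    Illustrative stand-in for the paper's von Neumann entropy input.
    """
    out_deg = Counter(u for u, _ in edges)
    counts = Counter(out_deg.get(v, 0) for v in range(n_nodes))
    return {d: c / n_nodes for d, c in counts.items()}

def shannon_entropy(p):
    """Shannon entropy (natural log) of a distribution given as {value: prob}."""
    return -sum(q * math.log(q) for q in p.values() if q > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: H(M) - (H(P) + H(Q)) / 2, M = (P + Q) / 2."""
    m = {d: (p.get(d, 0.0) + q.get(d, 0.0)) / 2 for d in set(p) | set(q)}
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

def js_kernel(g1, g2):
    """Jensen-Shannon kernel k(G1, G2) = ln 2 - JSD(P1, P2).

    Maximal (ln 2) for graphs with identical distributions; 0 when
    the two distributions have disjoint supports.
    """
    p1 = degree_distribution(*g1)
    p2 = degree_distribution(*g2)
    return math.log(2) - jsd(p1, p2)

# Usage: a directed chain versus a directed star on three vertices.
chain = (3, [(0, 1), (1, 2)])
star = (3, [(0, 1), (0, 2)])
print(js_kernel(chain, chain))  # maximal similarity, ln 2
print(js_kernel(chain, star))   # smaller, but positive
```

The divergence is bounded by ln 2, so subtracting it from ln 2 yields a non-negative similarity that is symmetric in its arguments, mirroring the structure of the kernel described in the abstract.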
Keywords: Information theoretic kernel · Jensen-Shannon divergence · von Neumann entropy