Knowledge Graph Embedding via Relation Paths and Dynamic Mapping Matrix

  • Shengwu Xiong
  • Weitao Huang
  • Pengfei Duan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11158)

Abstract

Knowledge graph embedding aims to embed both entities and relations into a low-dimensional vector space. Most existing representation-learning methods consider only direct relations, and some also model multiple-step relation paths. Although these methods achieve state-of-the-art performance, knowledge graphs remain far from complete. In this paper, a novel path-augmented TransD (PTransD) model is proposed to improve the accuracy of knowledge graph embedding. The model represents each entity and each relation with two vectors: one captures the meaning of the entity (or relation), and the other is used to construct a dynamic mapping matrix. PTransD treats relation paths as translations between entities during representation learning. Experimental results on a public dataset show that PTransD achieves significant and consistent improvements on knowledge graph completion.
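The two-vector scheme described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: it uses the TransD-style dynamic mapping matrix M = r_p h_p^T + I, an L2 translation energy, and additive composition of relation vectors along a path (one common choice in path-based models). All function names are illustrative.

```python
import numpy as np

def project(entity, entity_p, relation_p):
    """Project an entity into the relation's space via a dynamic mapping matrix.

    The matrix M = r_p h_p^T + I is built from the second ("projection")
    vector of the entity and of the relation, as in TransD.
    """
    m = np.outer(relation_p, entity_p) + np.eye(len(relation_p), len(entity_p))
    return m @ entity

def score(h, h_p, t, t_p, r, r_p):
    """Translation-based energy for a triple (h, r, t); lower is more plausible."""
    return np.linalg.norm(project(h, h_p, r_p) + r - project(t, t_p, r_p), ord=2)

def path_score(h, h_p, t, t_p, relations, relation_ps):
    """Score a relation path by composing its relation vectors additively.

    The path's projection vector is taken as the mean of the per-relation
    projection vectors (a hypothetical choice for this sketch).
    """
    r = np.sum(relations, axis=0)
    r_p = np.mean(relation_ps, axis=0)
    return score(h, h_p, t, t_p, r, r_p)
```

With zero projection vectors the mapping matrix reduces to the identity and the energy collapses to the plain TransE translation distance, which makes the relationship between the direct-relation score and the path score easy to check by hand.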

Keywords

Representation learning · Knowledge graph · Dynamic mapping matrix · Relation path

Notes

Acknowledgments

This work was partially supported by National Key R&D Program of China (No. 2016YFD0101903), National Natural Science Foundation of China (No. 61702386, 61672398), Major Technical Innovation Program of Hubei Province (No. 2017AAA122), Key Natural Science Foundation of Hubei Province of China (No. 2017CFA012), Applied Fundamental Research of Wuhan (No. 20160101010004), Fundamental Research Funds for the Central Universities (WUT:2018IVB047) and Excellent Dissertation Cultivation Funds of Wuhan University of Technology (2017-YS-061).


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. School of Computer Science and Technology, Wuhan University of Technology, Wuhan, China
  2. Hubei Key Laboratory of Transportation Internet of Things, Wuhan, China