
Incorporating Term Definitions for Taxonomic Relation Identification

  • Yongpan Sheng
  • Tianxing Wu
  • Xin Wang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12032)

Abstract

Taxonomic relations (also called "is-A" relations) are key components of taxonomies, semantic hierarchies, and knowledge graphs. Previous work on identifying taxonomic relations has mostly relied on linguistic pattern-based and distributional approaches. However, these approaches depend on the availability of a corpus large enough to cover all terms of interest and to provide sufficient contextual information to represent their meanings, so their generalization ability is far from satisfactory. In this paper, we propose a novel neural network model that enhances the semantic representations of term pairs by encoding their respective definitions for the purpose of taxonomic relation identification. This has two main benefits: (i) definitional sentences express precise, corpus-independent meanings of terms, so a definition-driven approach generalizes well to unseen terms and to taxonomic relations that are not covered by the domain-specific training data; (ii) combining global contextual information from a large corpus with sense-level definitions provides a richer interpretation of terms from a broader knowledge-base perspective and supports accurate prediction of the taxonomic relations between term pairs. Experimental results show that our model outperforms several competitive baseline methods in terms of F-score on both domain-specific and open-domain datasets.
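
To make the definition-driven idea concrete, below is a minimal, hypothetical sketch in PyTorch: each term is represented by its own embedding concatenated with a BiLSTM encoding of its definition sentence, and the concatenated pair representation feeds a binary is-A classifier. The class name, dimensions, and mean pooling over the definition are illustrative assumptions for exposition only, not the paper's actual architecture.

# Minimal sketch (assumed, not the authors' model): term embedding + BiLSTM
# encoding of the term's definition, pair representation -> binary is-A classifier.
import torch
import torch.nn as nn


class DefinitionAwareIsAClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150):
        super().__init__()
        # Shared embeddings for terms and definition words (dimensions are illustrative).
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.def_encoder = nn.LSTM(embed_dim, hidden_dim,
                                   batch_first=True, bidirectional=True)
        pair_dim = 2 * (embed_dim + 2 * hidden_dim)  # hyponym + hypernym representations
        self.classifier = nn.Sequential(
            nn.Linear(pair_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2),                # logits over {not is-A, is-A}
        )

    def encode_term(self, term_ids, def_ids):
        term_vec = self.embedding(term_ids)                        # (batch, embed_dim)
        def_states, _ = self.def_encoder(self.embedding(def_ids))  # (batch, def_len, 2*hidden_dim)
        def_vec = def_states.mean(dim=1)                           # simple pooling over the definition
        return torch.cat([term_vec, def_vec], dim=-1)

    def forward(self, hypo_ids, hypo_def_ids, hyper_ids, hyper_def_ids):
        hypo = self.encode_term(hypo_ids, hypo_def_ids)
        hyper = self.encode_term(hyper_ids, hyper_def_ids)
        return self.classifier(torch.cat([hypo, hyper], dim=-1))


# Toy usage with random ids, e.g. the pair ("dog", "animal") and their definitions.
model = DefinitionAwareIsAClassifier(vocab_size=10000)
logits = model(torch.tensor([1]), torch.randint(0, 10000, (1, 12)),
               torch.tensor([2]), torch.randint(0, 10000, (1, 9)))
print(logits.shape)  # torch.Size([1, 2])

In this sketch the definition encoding is what gives the model a corpus-independent signal: even if a term never appears in the training corpus, its definition sentence still yields a usable representation.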

Keywords

Taxonomic relation identification · Definition-driven approach

Notes

Acknowledgments

This work is supported by the National Natural Science Foundation of China (61572353, 61402323), the National High-tech R&D Program of China (863 Program) (2013AA013204), and the Natural Science Foundation of Tianjin (17JCYBJC15400).


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China
  2. School of Computer Science and Engineering, Nanyang Technological University, Singapore, Singapore
  3. College of Intelligence and Computing, Tianjin University, Tianjin, China
