
Entity Hyponymy Extraction of Complex Sentence Combining Bootstrapping and At-BiLSTM in Special Domain

  • Huaqin Li
  • Zhiju Zhang
  • Zhengtao Yu
  • Hongbin Wang
  • Hua Lai
Conference paper
Part of the Studies in Distributed Intelligence book series (SDI)

Abstract

Acquiring entity hyponymy relations from complex sentences in a special domain is a highly difficult task. To tackle this problem, this paper proposes a novel method that combines the Bootstrapping method with Attention-Based Bidirectional Long Short-Term Memory networks (Bo-At-BiLSTM). The experimental corpus covers the tourism domain in China. First, the Bootstrapping method is used to obtain a set of extraction patterns. Then, pattern matching is applied to acquire candidate sentences, which are converted into word embeddings. Next, the embeddings are fed into a bidirectional Long Short-Term Memory network with an attention mechanism. Finally, a Softmax classifier outputs the extraction results. Experimental results on the tourism corpus show that the proposed approach outperforms the baseline methods.
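The pipeline described above lends itself to a compact model sketch. The PyTorch snippet below is a minimal, illustrative implementation of the At-BiLSTM component, assuming pattern-matched candidate sentences have already been tokenized and mapped to word indices; the class name, dimensions, and number of relation labels are assumptions for illustration, not the authors' exact configuration.

```python
# A minimal sketch of the At-BiLSTM classifier described in the abstract.
# All hyperparameters (embed_dim, hidden_dim, num_labels) are illustrative
# assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn


class AtBiLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_labels=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Attention vector scores each time step of the BiLSTM output.
        self.attention = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices of pattern-matched sentences
        embedded = self.embedding(token_ids)           # (batch, seq_len, embed_dim)
        outputs, _ = self.bilstm(embedded)             # (batch, seq_len, 2*hidden_dim)
        scores = self.attention(torch.tanh(outputs))   # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)         # attention over time steps
        context = (weights * outputs).sum(dim=1)       # (batch, 2*hidden_dim)
        return self.classifier(context)                # logits for the Softmax layer


if __name__ == "__main__":
    model = AtBiLSTM(vocab_size=5000)
    dummy_batch = torch.randint(1, 5000, (4, 20))      # 4 sentences, 20 tokens each
    logits = model(dummy_batch)
    print(logits.shape)                                 # torch.Size([4, 2])
```

The attention step pools the BiLSTM outputs with a softmax over per-token scores rather than taking only the final hidden state, which reflects the abstract's motivation for weighting the informative tokens of long, complex sentences before classification.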

Keywords

Hyponymy extraction · Complex sentences · Bootstrapping method · At-BiLSTM

Notes

Acknowledgments

This work was supported by the National Key Research and Development Plan project (Grant Nos. 2018YFC0830105, 2018YFC0830100), the National Natural Science Foundation of China (Grant Nos. 61732005, 61672271, 61761026, and 61762056), the Yunnan High-Tech Industry Development Project (Grant No. 201606), the Natural Science Foundation of Yunnan Province (Grant No. 2018FB104), and the National Natural Science Foundation of China (Grant Nos. 61562052, 61462054, and 61866019).


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Huaqin Li¹
  • Zhiju Zhang¹
  • Zhengtao Yu¹
  • Hongbin Wang¹
  • Hua Lai¹

  1. School of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, China
