Nested relation extraction with iterative neural network

Abstract

Most existing research on relation extraction focuses on binary flat relations, such as a BornIn relation between a Person and a Location. However, a large portion of the objective facts described in natural language are complex, especially in professional documents in fields such as finance and biomedicine that require precise expression. For example, "the GDP of the United States in 2018 grew 2.9% compared with 2017" describes a growth-rate relation between two other relations about an economic index, which is beyond the expressive power of binary flat relations. Thus, we propose the nested relation extraction problem and formulate it as a directed acyclic graph (DAG) structure extraction problem. We then propose a solution based on an iterative neural network that extracts relations layer by layer. The proposed solution achieves F1 scores of 78.98 and 97.89 on two nested relation extraction tasks, namely semantic cause-and-effect relation extraction and formula extraction. Furthermore, we observe that nested relations are usually expressed in long sentences where entities are mentioned repeatedly, which makes annotation difficult and error-prone. Hence, we extend our model with a mention-insensitive mode that only requires relations to be annotated on entity concepts (instead of exact mentions) while preserving most of its performance. The mention-insensitive model outperforms the mention-sensitive model when the random level in mention selection is higher than 0.3.
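To make the DAG formulation concrete, the following is a minimal sketch (not the authors' code; all names are illustrative) of how the GDP example can be represented as a directed acyclic graph whose leaves are entity mentions and whose inner nodes are relations that may take other relations as arguments. The `depth` of the graph corresponds to the number of layer-by-layer passes an iterative extractor would need.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Node:
    """A DAG node: an entity (no args) or a relation over child nodes."""
    label: str        # entity text, or relation type
    args: tuple = ()  # argument nodes (empty for entities)


def depth(node: Node) -> int:
    """Number of relation layers above the entities; an iterative
    extractor resolves one layer per pass."""
    if not node.args:
        return 0
    return 1 + max(depth(a) for a in node.args)


# "the GDP of the United States in 2018 grew 2.9% compared with 2017"
gdp, usa = Node("GDP"), Node("United States")
y2018, y2017 = Node("2018"), Node("2017")
rate = Node("2.9%")

# Layer 1: two flat relations about the economic index.
index_2018 = Node("IndexValue", (gdp, usa, y2018))
index_2017 = Node("IndexValue", (gdp, usa, y2017))

# Layer 2: a growth-rate relation whose arguments are themselves relations.
growth = Node("GrowthRate", (index_2018, index_2017, rate))

print(depth(growth))  # prints 2: two extraction iterations needed
```

Note that the structure is a DAG rather than a tree because shared entities (here, `gdp` and `usa`) are reused by multiple relation nodes.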


References

  1. Ernst P, Siu A, Weikum G. HighLife: higher-arity fact harvesting. In: Proceedings of the 2018 World Wide Web Conference. 2018, 1013–1022

  2. Hassan N, Arslan F, Li C, Tremayne M. Toward automated fact-checking: detecting check-worthy factual claims by ClaimBuster. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 2017, 1803–1812

  3. Mintz M, Bills S, Snow R, Jurafsky D. Distant supervision for relation extraction without labeled data. In: Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP. 2009, 1003–1011

  4. Aggarwal C C, Zhai C. Mining Text Data. Springer Science & Business Media, 2012

  5. Miwa M, Bansal M. End-to-end relation extraction using LSTMs on sequences and tree structures. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016, 1105–1116

  6. Xu Y, Mou L, Li G, Chen Y, Peng H, Jin Z. Classifying relations via long short term memory networks along shortest dependency paths. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015, 1785–1794

  7. Zhou P, Shi W, Tian J, Qi Z, Li B, Hao H, Xu B. Attention-based bidirectional long short-term memory networks for relation classification. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016, 207–212

  8. Zhang Y, Qi P, Manning C D. Graph convolution over pruned dependency trees improves relation extraction. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018, 2205–2215

  9. Katiyar A, Cardie C. Going out on a limb: joint extraction of entity mentions and relations without dependency trees. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017, 917–928

  10. Christopoulou F, Miwa M, Ananiadou S. A walk-based model on entity graphs for relation extraction. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018, 81–88

  11. Zeng W, Lin Y, Liu Z, Sun M. Incorporating relation paths in neural relation extraction. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017, 1768–1777

  12. Suchanek F M, Kasneci G, Weikum G. YAGO: a large ontology from Wikipedia and WordNet. Journal of Web Semantics, 2008, 6(3): 203–217

  13. Zhou D, Zhong D, He Y. Biomedical relation extraction: from binary to complex. Computational and Mathematical Methods in Medicine, 2014

  14. McDonald R, Pereira F, Kulick S, Winters S, Jin Y, White P. Simple algorithms for complex relation extraction with applications to biomedical IE. In: Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics. 2005, 491–498

  15. Li J, Sun Y, Johnson R J, Sciaky D, Wei C H, Leaman R, Davis A P, Mattingly C J, Wiegers T C, Lu Z. BioCreative V CDR task corpus: a resource for chemical disease relation extraction. Database: the Journal of Biological Databases & Curation, 2016, 2016: baw068

  16. Peng Y, Wei C H, Lu Z. Improving chemical disease relation extraction with rich features and weakly labeled data. Journal of Cheminformatics, 2016, 8(1): 53

  17. Verga P, Strubell E, McCallum A. Simultaneously self-attending to all mentions for full-abstract biological relation extraction. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers). 2018, 872–884

  18. Cui L, Wei F, Zhou M. Neural open information extraction. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018, 407–413

  19. Reshadat V, Hoorali M, Faili H. A hybrid method for open information extraction based on shallow and deep linguistic analysis. Interdisciplinary Information Sciences, 2016, 22(1): 87–100

  20. Reshadat V, Faili H. A new open information extraction system using sentence difficulty estimation. Computing and Informatics, 2019, 38(4): 986–1008

  21. Sun M, Li X, Wang X, Fan M, Feng Y, Li P. Logician: a unified end-to-end neural approach for open-domain information extraction. In: Proceedings of the 11th ACM International Conference on Web Search & Data Mining. 2018

  22. Chen Y, Xu L, Liu K, Zeng D, Zhao J. Event extraction via dynamic multi-pooling convolutional neural networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015, 167–176

  23. Blunsom P, de Freitas N, Grefenstette E, Hermann K M. A deep architecture for semantic parsing. In: Proceedings of the ACL 2014 Workshop on Semantic Parsing. 2014

  24. Liang C, Berant J, Le Q, Forbus K D, Lao N. Neural symbolic machines: learning semantic parsers on Freebase with weak supervision. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017, 23–33

  25. Wang Y, Berant J, Liang P. Building a semantic parser overnight. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015, 1332–1342

  26. Xiao C, Dymetman M, Gardent C. Sequence-based structured prediction for semantic parsing. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016, 1341–1350

  27. Berant J, Chou A, Frostig R, Liang P. Semantic parsing on Freebase from question-answer pairs. In: Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing. 2013, 1533–1544

  28. Hershcovich D, Abend O, Rappoport A. A transition-based directed acyclic graph parser for UCCA. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017, 1127–1138

  29. Zhu X, Sobihani P, Guo H. Long short-term memory over recursive structures. In: Proceedings of the 32nd International Conference on Machine Learning. 2015, 1604–1612

  30. Tai K S, Socher R, Manning C D. Improved semantic representations from tree-structured long short-term memory networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015, 1556–1566

  31. Agerri R, Rigau G. Robust multilingual named entity recognition with shallow semi-supervised features. Artificial Intelligence, 2016, 238: 63–82

  32. Aguilar J, Beller C, McNamee P, Van Durme B, Strassel S, Song Z, Ellis J. A comparison of the events and relations across ACE, ERE, TAC-KBP, and FrameNet annotation standards. In: Proceedings of the 2nd Workshop on EVENTS: Definition, Detection, Coreference, and Representation. 2014, 45–53

  33. Girju R, Nakov P, Nastase V, Szpakowicz S, Turney P, Yuret D. SemEval-2007 task 04: classification of semantic relations between nominals. In: Proceedings of the 4th International Workshop on Semantic Evaluations. 2007, 13–18

  34. Hendrickx I, Kim S N, Kozareva Z, Nakov P, Ó Séaghdha D, Padó S, Pennacchiotti M, Romano L, Szpakowicz S. SemEval-2010 task 8: multi-way classification of semantic relations between pairs of nominals. In: Proceedings of the Workshop on Semantic Evaluations: Recent Achievements and Future Directions. 2009, 94–99

  35. Wang Y, Liu X, Shi S. Deep neural solver for math word problems. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017, 845–854

  36. Wang L, Zhang D, Zhang J, Xu X, Gao L, Dai B T, Shen H T. Template-based math word problem solvers with recursive neural networks. In: Proceedings of the 33rd AAAI Conference on Artificial Intelligence. 2019, 7144–7151

  37. Zeng D, Liu K, Chen Y, Zhao J. Distant supervision for relation extraction via piecewise convolutional neural networks. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015, 1753–1762

  38. Hochreiter S, Schmidhuber J. Long short-term memory. Neural Computation, 1997, 9(8): 1735–1780

  39. Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015, 1412–1421

  40. Cao Y, Li H, Luo P, Yao J. Towards automatic numerical cross-checking: extracting formulas from text. In: Proceedings of the 2018 World Wide Web Conference. 2018, 1795–1804

  41. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A N, Kaiser L, Polosukhin I. Attention is all you need. In: Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017, 6000–6010

  42. Zeiler M D. ADADELTA: an adaptive learning rate method. 2012, arXiv preprint arXiv:1212.5701

  43. Huang D, Yao J, Lin C, Zhou Q, Yin J. Using intermediate representations to solve math word problems. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018, 419–428

  44. Gu J, Lu Z, Li H, Li V O. Incorporating copying mechanism in sequence-to-sequence learning. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016, 1631–1640

  45. Klein G, Kim Y, Deng Y, Senellart J, Rush A. OpenNMT: open-source toolkit for neural machine translation. In: Proceedings of the Annual Meeting of the Association for Computational Linguistics, System Demonstrations. 2017, 67–72

  46. Zheng S, Wang F, Bao H, Hao Y, Zhou P, Xu B. Joint extraction of entities and relations based on a novel tagging scheme. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017, 1227–1236

  47. Cao Y, Chen D, Li H, Luo P. Nested relation extraction with iterative neural network. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management. 2019, 1001–1010


Acknowledgements

The research work was partially supported by the National Key Research and Development Program of China (2017YFB1002104), the National Natural Science Foundation of China (Grant No. U1811461), and the Innovation Program of Institute of Computing Technology, CAS.

Author information


Corresponding author

Correspondence to Ping Luo.

Additional information

A preliminary version of this work was published in the Proceedings of the 28th ACM International Conference on Information and Knowledge Management (CIKM) [47].

Yixuan Cao received the BE degree in transportation engineering from Tongji University, China in 2015, and is now a PhD student at the Institute of Computing Technology, Chinese Academy of Sciences, China. His research interests include natural language processing and information extraction.

Dian Chen received the BE degree in IoT engineering from Chongqing University, China in 2016, and is now a PhD student at the Institute of Computing Technology, Chinese Academy of Sciences, China. His research interests focus on natural language processing, deep learning, and data mining.

Zhengqi Xu received the BE degree in remote sensing from Beihang University, China in 2019, and is now an MS student at the Institute of Computing Technology, Chinese Academy of Sciences, China. His research interests focus on machine learning, information retrieval, and information extraction.

Hongwei Li received the BE degree in software engineering from Fuzhou University, China in 2015, and is now a PhD student at the Institute of Computing Technology, Chinese Academy of Sciences, China. His research interests focus on machine learning, natural language processing, and information extraction.

Ping Luo received the PhD degree in computer science from the Institute of Computing Technology, Chinese Academy of Sciences, China. He is an associate professor at the Institute of Computing Technology, Chinese Academy of Sciences (CAS), China. His general area of research is knowledge discovery and machine learning.


About this article


Cite this article

Cao, Y., Chen, D., Xu, Z. et al. Nested relation extraction with iterative neural network. Front. Comput. Sci. 15, 153323 (2021). https://doi.org/10.1007/s11704-020-9420-6


Keywords

  • nested relation extraction
  • mention-insensitive relation
  • iterative neural network