
A New Dependency Parsing Tree Generation Algorithm Based on the Semantic Dependency Relationship Between Words

  • Conference paper
  • First Online:
Cloud Computing and Security (ICCCS 2018)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11065)


Abstract

This paper presents a new dependency parsing tree (DPT) generation algorithm. Unlike similar algorithms based on statistical probability models, the proposed algorithm converts the DPT generation problem into a semantic segment division problem. We first analyze the co-occurrence frequency of words and show that it can serve as the basis for judging the semantic dependency relationship between words. We then analyze how the co-occurrence frequency entropy of words changes within a semantic unit (the sentence is used as the basic semantic unit in this paper) and present an algorithm that divides a sentence into semantic fragments whose words have tight semantic relationships with one another. On this basis, the DPT generation algorithm proceeds in three steps. First, the sentence is divided into semantic fragments. Second, the semantic core word and non-core words of each fragment are distinguished according to the semantic dependency relationships between the words in that fragment. Finally, the DPT is generated according to the semantic dependency relationships between the semantic core words. Experiments on court documents collected from the web show that the DPTs generated by our algorithm maintain a high degree of consistency with DPTs produced by humans.
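
The abstract does not give the paper's exact formulas, so the following Python sketch only illustrates the three-step structure it describes. The function names, the raw co-occurrence frequency used as a "strength" measure, and the fragment-boundary threshold are illustrative assumptions standing in for the paper's co-occurrence and entropy-change criteria.

```python
# Minimal sketch of the three-step DPT generation described above.
# The strength measure and the boundary threshold are assumptions,
# not the paper's actual formulas.
from collections import defaultdict
from typing import Dict, List, Tuple

def train_cooccurrence(corpus: List[List[str]], window: int = 2) -> Dict[Tuple[str, str], int]:
    """Count how often two words co-occur within a small window (assumed statistic)."""
    counts: Dict[Tuple[str, str], int] = defaultdict(int)
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(i + 1, min(i + 1 + window, len(sent))):
                counts[(w, sent[j])] += 1
                counts[(sent[j], w)] += 1
    return counts

def strength(w1: str, w2: str, counts: Dict[Tuple[str, str], int]) -> float:
    """Raw co-occurrence frequency as a stand-in for semantic dependency strength."""
    return float(counts.get((w1, w2), 0))

def split_into_fragments(sentence: List[str], counts, threshold: float = 1.0) -> List[List[str]]:
    """Step 1: cut the sentence where adjacent words are only weakly related
    (a simplified stand-in for the entropy-change criterion)."""
    fragments, current = [], [sentence[0]]
    for prev, word in zip(sentence, sentence[1:]):
        if strength(prev, word, counts) >= threshold:
            current.append(word)
        else:
            fragments.append(current)
            current = [word]
    fragments.append(current)
    return fragments

def core_word(fragment: List[str], counts) -> str:
    """Step 2: the core word is the word most strongly linked to the rest of its fragment."""
    return max(fragment,
               key=lambda w: sum(strength(w, o, counts) for o in fragment if o != w))

def build_dpt(sentence: List[str], counts) -> Dict[str, str]:
    """Step 3: attach non-core words to their fragment's core word, then attach each
    core word to its most strongly related earlier core word. Returns a child -> head map."""
    fragments = split_into_fragments(sentence, counts)
    cores = [core_word(f, counts) for f in fragments]
    heads: Dict[str, str] = {}
    for frag, core in zip(fragments, cores):
        for w in frag:
            if w != core:
                heads[w] = core
    for i, core in enumerate(cores[1:], start=1):
        heads[core] = max(cores[:i], key=lambda c: strength(core, c, counts))
    return heads
```

Calling `build_dpt(sentence, train_cooccurrence(corpus))` on a segmented sentence yields a head map whose root is the core word of the first fragment; in the paper's setting the training corpus would be the court documents collected from the web.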

Author information

Corresponding author

Correspondence to Jin Han.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Han, J., Xu, W.L., Jing, Y.T. (2018). A New Dependency Parsing Tree Generation Algorithm Based on the Semantic Dependency Relationship Between Words. In: Sun, X., Pan, Z., Bertino, E. (eds.) Cloud Computing and Security. ICCCS 2018. Lecture Notes in Computer Science, vol. 11065. Springer, Cham. https://doi.org/10.1007/978-3-030-00012-7_37

  • DOI: https://doi.org/10.1007/978-3-030-00012-7_37

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-00011-0

  • Online ISBN: 978-3-030-00012-7

  • eBook Packages: Computer Science, Computer Science (R0)
