
Effective Approach to Joint Training of POS Tagging and Dependency Parsing Models

  • Xuan-Dung Doan
  • Tu-Anh Tran
  • Le-Minh Nguyen
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1215)

Abstract

We propose a joint model for POS tagging and dependency parsing. Our model consists of a BiLSTM-CNN-CRF-based POS tagger [26] and a Deep Biaffine Attention-based dependency parser [24]. A combined objective function is used to jointly train both models. Experimental results show very competitive performance on several languages of the Universal Dependencies (UD) v2.2 treebanks [11].
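The combined objective is, at its core, the sum of the tagger's loss and the parser's loss computed over a shared encoder. Below is a minimal PyTorch-style sketch of that idea; it is not the authors' implementation. The module names are hypothetical, a softmax tagging head stands in for the CRF layer, only unlabeled arcs are scored, and the loss weight alpha is illustrative.

```python
# Minimal sketch of a joint POS-tagging + dependency-parsing objective.
# Assumptions (not from the paper): layer sizes, a softmax tagger head in
# place of the CRF, and unlabeled (arc-only) biaffine scoring.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointTaggerParser(nn.Module):
    def __init__(self, vocab_size, n_tags, emb_dim=100, hidden=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared BiLSTM encoder feeding both the tagger and the parser heads.
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        enc_dim = 2 * hidden
        # POS head: plain softmax classifier (a CRF layer in the paper).
        self.tag_out = nn.Linear(enc_dim, n_tags)
        # Biaffine arc scorer: separate head/dependent projections.
        self.head_mlp = nn.Linear(enc_dim, hidden)
        self.dep_mlp = nn.Linear(enc_dim, hidden)
        self.biaffine = nn.Parameter(torch.zeros(hidden + 1, hidden))

    def forward(self, words):
        h, _ = self.encoder(self.embed(words))       # (B, T, enc_dim)
        tag_logits = self.tag_out(h)                 # (B, T, n_tags)
        head = torch.relu(self.head_mlp(h))          # (B, T, hidden)
        dep = torch.relu(self.dep_mlp(h))            # (B, T, hidden)
        # Append a bias column to the dependent representation.
        ones = dep.new_ones(dep.shape[:-1] + (1,))
        dep = torch.cat([dep, ones], dim=-1)         # (B, T, hidden + 1)
        # arc_scores[b, i, j] = score of word j being the head of word i.
        arc_scores = dep @ self.biaffine @ head.transpose(1, 2)
        return tag_logits, arc_scores


def joint_loss(tag_logits, arc_scores, gold_tags, gold_heads, alpha=1.0):
    """Combined objective: tagging loss plus alpha times the arc loss."""
    pos_loss = F.cross_entropy(tag_logits.flatten(0, 1), gold_tags.flatten())
    arc_loss = F.cross_entropy(arc_scores.flatten(0, 1), gold_heads.flatten())
    return pos_loss + alpha * arc_loss
```

Because both losses are computed on the same batch and backpropagated together, the shared encoder receives gradients from both tasks, which is what makes the training joint rather than pipelined.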

Keywords

Dependency parsing · Biaffine attention · Joint training

References

  1. Taskar, B., Chatalbashev, V., Koller, D., Guestrin, C.: Learning structured prediction models: a large margin approach. In: Proceedings of the Twenty-Second International Conference on Machine Learning (ICML 2005), Bonn, Germany, August 7–11, 2005, pp. 896–903 (2005)
  2. Sutton, C., McCallum, A.: An introduction to conditional random fields for relational learning (2006)
  3. Dyer, C., Ballesteros, M., Ling, W., Matthews, A., Smith, N.A.: Transition-based dependency parsing with stack long short-term memory. In: Proceedings of ACL-2015, Long Papers, vol. 1, Beijing, pp. 334–343 (2015)
  4. Fernández-González, D., Gómez-Rodríguez, C.: Left-to-right dependency parsing with pointer networks. In: Proceedings of the 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2019), Minneapolis (2019)
  5. Chen, D., Manning, C.: A fast and accurate dependency parser using neural networks. In: Proceedings of EMNLP-2014, Doha, Qatar, pp. 740–750 (2014)
  6. Nguyen, D.Q., Dras, M., Johnson, M.: A novel neural network model for joint POS tagging and graph-based dependency parsing. In: Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies (CoNLL), pp. 134–142 (2017)
  7. Nguyen, D.Q., Verspoor, K.: An improved neural network model for joint POS tagging and dependency parsing. In: Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies (CoNLL), pp. 81–91 (2018)
  8. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. In: Proceedings of ICLR-2015 (2015)
  9. Kiperwasser, E., Goldberg, Y.: Simple and accurate dependency parsing using bidirectional LSTM feature representations. Trans. Assoc. Comput. Linguist. 4, 313–327 (2016)
  10. Eisner, J.M.: Three new probabilistic models for dependency parsing: an exploration. In: Proceedings of COLING, pp. 340–345 (1996)
  11. Nivre, J., Abrams, M., et al.: Universal Dependencies 2.2 (2018). http://hdl.handle.net/11234/12837
  12. Nivre, J.: An efficient algorithm for projective dependency parsing. In: Proceedings of the 8th International Workshop on Parsing Technologies (IWPT), pp. 149–160 (2003)
  13. Lafferty, J., McCallum, A., Pereira, F.C.N.: Conditional random fields: probabilistic models for segmenting and labeling sequence data. In: Proceedings of ICML-2001, vol. 951, pp. 282–289 (2001)
  14. Hashimoto, K., Xiong, C., Tsuruoka, Y., Socher, R.: A joint many-task model: growing a neural network for multiple NLP tasks. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017) (2017)
  15. Van Nguyen, K., Nguyen, N.L.T.: Error analysis for Vietnamese dependency parsing. In: Proceedings of the 7th International Conference on Knowledge and Systems Engineering (KSE), Ho Chi Minh City, Vietnam, vol. 10 (2015)
  16. Caruana, R.: Multitask learning. Mach. Learn. 28(1), 41–75 (1997)
  17. Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P.: Natural language processing (almost) from scratch. J. Mach. Learn. Res. 12, 2493–2537 (2011)
  18. McDonald, R., Pereira, F.: Online learning of approximate dependency parsing algorithms. In: Proceedings of EACL, pp. 81–88 (2006)
  19. McDonald, R., Crammer, K., Pereira, F.: Online large-margin training of dependency parsers. In: Proceedings of ACL, pp. 91–98 (2005)
  20. McDonald, R., Nivre, J.: Analyzing and integrating dependency parsers. Comput. Linguist. 37(1), 197–230 (2011)
  21. Ruder, S.: An overview of multi-task learning in deep neural networks. arXiv preprint arXiv:1706.05098 (2017)
  22. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
  23. Luong, T., Pham, H., Manning, C.D.: Effective approaches to attention-based neural machine translation. In: Proceedings of EMNLP-2015, Lisbon, Portugal, pp. 1412–1421 (2015)
  24. Dozat, T., Manning, C.D.: Deep biaffine attention for neural dependency parsing. In: Proceedings of ICLR-2017, Toulon, France (2017)
  25. Ma, X., Hu, Z., Liu, J., Peng, N., Neubig, G., Hovy, E.H.: Stack-pointer networks for dependency parsing. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018), Long Papers, vol. 1, Melbourne, Australia, pp. 1403–1414 (2018)
  26. Ma, X., Hovy, E.: End-to-end sequence labeling via bi-directional LSTM-CNNs-CRF. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL 2016), Berlin, Germany, pp. 1064–1074 (2016)
  27. LeCun, Y., et al.: Backpropagation applied to handwritten zip code recognition. Neural Comput. 1, 541–551 (1989)
  28. Li, Z., Zhang, M., Che, W., Liu, T., Chen, W., Li, H.: Joint models for Chinese POS tagging and dependency parsing. In: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing (EMNLP-2011), Edinburgh, Scotland, UK, pp. 1180–1191 (2011)
  29. Ahmad, W.U., Zhang, Z., Ma, X., Hovy, E., Chang, K.-W., Peng, N.: On difficulties of cross-lingual transfer with order differences: a case study on dependency parsing. In: NAACL (2019)
  30. Che, W., Liu, Y., Wang, Y., Zheng, B., Liu, T.: Towards better UD parsing: deep contextualized word embeddings, ensemble, and treebank concatenation. In: Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, pp. 55–64 (2018)
  31. Wang, W., Chang, B., Mansur, M.: Improved dependency parsing using implicit word connections learned from unlabeled data. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pp. 2857–2863 (2018)

Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Viettel Cyberspace Center, Viettel Group, Hanoi, Vietnam
  2. Japan Advanced Institute of Science and Technology, Ishikawa, Japan
