Phrase-Based Chinese-Vietnamese Pseudo-Parallel Sentence Pair Generation

  • Jiaxin Zhai
  • Zhengtao Yu (email author)
  • Shengxiang Gao
  • Zhenhan Wang
  • Liuqing Pu
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1104)

Abstract

Chinese-Vietnamese neural machine translation performs poorly because parallel corpora for this language pair are scarce. To address this problem, we propose a phrase-based method for generating Chinese-Vietnamese pseudo-parallel sentence pairs, which expands the training corpus and thereby improves translation performance. Starting from a small-scale Chinese-Vietnamese parallel corpus, the method first selects candidate phrases according to phrase-structure syntactic information, then combines word-alignment information with replacement rules to substitute aligned phrase pairs, and in this way expands the Chinese-Vietnamese pseudo-parallel corpus. Experiments show that the method effectively generates Chinese-Vietnamese pseudo-parallel sentence pairs and improves the performance of Chinese-Vietnamese neural machine translation.
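The replacement step described in the abstract (pick a syntactic phrase in the source sentence, locate its aligned span in the target sentence via word alignment, and substitute a compatible phrase pair on both sides) can be illustrated with a minimal sketch. The helper names, toy gloss tokens, and alignment data below are illustrative assumptions, not the paper's actual implementation or corpus:

```python
def aligned_target_span(alignment, src_start, src_end):
    """Return the contiguous target span aligned to source tokens
    [src_start, src_end), or None if no consistent span exists."""
    tgt_positions = [t for s, t in alignment if src_start <= s < src_end]
    if not tgt_positions:
        return None
    lo, hi = min(tgt_positions), max(tgt_positions)
    # Consistency check: no target token inside [lo, hi] may align
    # to a source token outside the candidate phrase.
    for s, t in alignment:
        if lo <= t <= hi and not (src_start <= s < src_end):
            return None
    return lo, hi + 1

def replace_phrase(src_tokens, tgt_tokens, alignment, src_span, new_src, new_tgt):
    """Swap the phrase at src_span, and its aligned target span,
    for a new phrase pair, yielding one pseudo-parallel sentence pair."""
    span = aligned_target_span(alignment, *src_span)
    if span is None:
        return None  # alignment inconsistent; skip this candidate
    t_lo, t_hi = span
    s_lo, s_hi = src_span
    new_src_sent = src_tokens[:s_lo] + new_src + src_tokens[s_hi:]
    new_tgt_sent = tgt_tokens[:t_lo] + new_tgt + tgt_tokens[t_hi:]
    return new_src_sent, new_tgt_sent

# Toy example: English glosses stand in for Chinese/Vietnamese tokens,
# and the noun phrase at source positions [2, 4) is replaced.
src = ["I", "read", "a", "book"]
tgt = ["Toi", "doc", "mot", "cuon_sach"]
alignment = [(0, 0), (1, 1), (2, 2), (3, 3)]
pair = replace_phrase(src, tgt, alignment, (2, 4), ["a", "newspaper"], ["mot", "to_bao"])
print(pair)
```

In practice the source-side phrase spans would come from a syntactic parser (the paper cites Stanford CoreNLP) and the alignments from GIZA++; the consistency check above mirrors the standard phrase-pair extraction constraint used in phrase-based SMT.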

Keywords

Phrase-structure syntax · Phrase replacement · Pseudo-parallel sentence pair generation · Chinese-Vietnamese · Neural machine translation

Acknowledgements

This work was supported by the National Key Research and Development Plan of China (Grant Nos. 2018YFC0830105, 2018YFC0830100), the National Natural Science Foundation of China (Grant Nos. 61732005, 61672271, 61761026, and 61762056), the Yunnan High-Tech Industry Development Project (Grant No. 201606), and the Natural Science Foundation of Yunnan Province (Grant No. 2018FB104).

References

  1. Sennrich, R., Haddow, B., Birch, A.: Improving neural machine translation models with monolingual data. arXiv preprint arXiv:1511.06709 (2015)
  2. He, W., Zhao, S.Q., Wang, H.F., et al.: Enriching SMT training data via paraphrasing. In: Proceedings of the 5th International Joint Conference on Natural Language Processing (IJCNLP), 8–13 November 2011, Chiang Mai, Thailand, pp. 803–810 (2011)
  3. Bond, F., Nichols, E., Appling, D.S., et al.: Improving statistical machine translation by paraphrasing the training data. In: Proceedings of the International Workshop on Spoken Language Translation (IWSLT), 20–21 October 2008, Honolulu, Hawaii, USA, pp. 150–157 (2008)
  4. Nakov, P.: Improved statistical machine translation using monolingual paraphrases. In: Proceedings of the 18th European Conference on Artificial Intelligence (ECAI), 21–25 July 2008, Patras, Greece, pp. 338–342 (2008)
  5. He, W., Liu, T.: Parse-realize based paraphrasing and SMT corpus enriching. J. Harbin Inst. Technol. 45(5), 45–50 (2013)
  6. Fadaee, M., Bisazza, A., Monz, C.: Data augmentation for low-resource neural machine translation. arXiv preprint arXiv:1705.00440 (2017)
  7. Cai, Z.L., Yang, M.M., Xiong, D.Y.: Data augmentation for neural machine translation. J. Chin. Inf. Process. 32(7) (2018)
  8. Pinker, S.: The Language Instinct: How the Mind Creates Language, pp. 101–105. Penguin, UK (2003)
  9. Manning, C.D., Surdeanu, M., Bauer, J., et al.: The Stanford CoreNLP natural language processing toolkit. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (ACL), 22–27 June 2014, Baltimore, MD, USA, pp. 55–60 (2014)
  10. Och, F.J.: GIZA++: training of statistical translation models (2001). http://www.informatik.rwth-aachen.de/Colleagues/och/software/GIZA++.html
  11. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473 (2014)
  12. Wu, Y., Schuster, M., Chen, Z., et al.: Google's neural machine translation system: bridging the gap between human and machine translation. arXiv preprint arXiv:1609.08144 (2016)
  13. Vaswani, A., Shazeer, N., Parmar, N., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems 30 (NIPS 2017), 4–9 December 2017, Long Beach, CA, USA, pp. 6000–6010 (2017)

Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  • Jiaxin Zhai (1, 2)
  • Zhengtao Yu (1, 2) — email author
  • Shengxiang Gao (1, 2)
  • Zhenhan Wang (1, 2)
  • Liuqing Pu (1, 2)
  1. School of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, China
  2. Artificial Intelligent Key Laboratory of Yunnan Province, Kunming University of Science and Technology, Kunming, China
