
Attention-Based English to Mizo Neural Machine Translation

  • Candy Lalrempuii
  • Badal Soni
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1241)

Abstract

Machine Translation alleviates the need for human translators by enabling instant translation between multiple source and target languages. Neural Machine Translation (NMT) has exhibited remarkable results for high-resource languages. However, for resource-scarce languages, NMT does not perform equally well. In this paper, various NMT models based on different configurations, such as unidirectional and bidirectional Long Short-Term Memory (LSTM), deep and shallow networks, and optimization methods like Stochastic Gradient Descent (SGD) and Adam, have been trained and tested for the resource-scarce English–Mizo language pair. The quality of the output translations has been evaluated using automatic evaluation metrics, and the predicted translations have been analyzed based on the best and worst performances on the test data.
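The attention mechanism underlying the models described above scores each encoder state against the current decoder state and forms a weighted context vector. As a rough illustration only, the following is a pure-Python sketch of dot-product (Luong-style) global attention; it is not the authors' implementation (their models are LSTM-based and trained with OpenNMT-style tooling), and the function name and vector representation are our own.

```python
import math

def attention(decoder_state, encoder_states):
    """Dot-product global attention: score each encoder state against the
    decoder state, softmax the scores, and return the attention weights
    together with the weighted-sum context vector."""
    # Alignment scores: dot product of decoder state with each encoder state
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    # Numerically stable softmax over the scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of encoder states
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return weights, context
```

In a full model the context vector is concatenated with the decoder state and passed through a learned layer before predicting the next target token.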

Keywords

Machine translation · Neural machine translation · Mizo language
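Among the automatic evaluation metrics used for such translation output, BLEU compares clipped n-gram overlap between a candidate and a reference, combined with a brevity penalty. The following is a minimal sentence-level sketch for illustration, assuming whitespace tokenization and uniform n-gram weights; production evaluation would use an established implementation rather than this simplified version.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram precisions
    (n = 1..max_n) multiplied by a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = ngrams(cand, n)
        ref_ngrams = ngrams(ref, n)
        # Counter intersection gives clipped (min) counts per n-gram
        overlap = sum((cand_ngrams & ref_ngrams).values())
        total = max(sum(cand_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # geometric mean is zero if any precision is zero
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages overly short candidates
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0; any candidate with no 4-gram overlap scores 0.0 under this unsmoothed variant, which is why corpus-level or smoothed BLEU is preferred in practice.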


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. National Institute of Technology Silchar, Silchar, India
