
Attention-Based English to Mizo Neural Machine Translation

  • Conference paper

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1241))

Abstract

Machine Translation alleviates the need for human translators by enabling instant translation between source and target languages. Neural Machine Translation (NMT) has exhibited remarkable results for high-resource languages; however, it does not perform equally well for resource-scarce languages. In this paper, NMT models with various configurations, such as unidirectional and bidirectional Long Short-Term Memory (LSTM), deep and shallow networks, and optimization methods such as Stochastic Gradient Descent (SGD) and Adam, have been trained and tested on the resource-scarce English to Mizo language pair. The quality of the output translations has been evaluated using automatic evaluation metrics, and the predicted translations have been analyzed based on the best and worst performances on the test data.
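As a rough illustration of the attention mechanism the abstract refers to, the sketch below implements Luong-style global (dot-product) attention with NumPy: the decoder state is scored against each encoder hidden state, the scores are normalized with a softmax, and a context vector is formed as the weighted sum of encoder states. This is a minimal assumption-based sketch, not the authors' actual model; the function names and toy dimensions are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def luong_dot_attention(decoder_state, encoder_states):
    """Luong-style global attention with the dot-product score.

    decoder_state:  shape (d,), current decoder hidden state
    encoder_states: shape (T, d), hidden states for the T source tokens
    Returns (attention weights over source tokens, context vector).
    """
    scores = encoder_states @ decoder_state   # (T,) alignment scores
    weights = softmax(scores)                 # (T,) attention distribution
    context = weights @ encoder_states        # (d,) weighted sum of states
    return weights, context

# Toy example: a 4-token source sentence with hidden size 3.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=3)
w, ctx = luong_dot_attention(dec, enc)
print("attention weights:", np.round(w, 3))
```

In a bidirectional-LSTM encoder, `encoder_states` would typically be the concatenated forward and backward hidden states for each source token; the attended context is then combined with the decoder state to predict the next target word.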


Notes

  1.

    http://en.wikipedia.org/wiki/Mizo_language/.



Author information

Correspondence to Candy Lalrempuii.


Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Lalrempuii, C., Soni, B. (2020). Attention-Based English to Mizo Neural Machine Translation. In: Bhattacharjee, A., Borgohain, S., Soni, B., Verma, G., Gao, XZ. (eds) Machine Learning, Image Processing, Network Security and Data Sciences. MIND 2020. Communications in Computer and Information Science, vol 1241. Springer, Singapore. https://doi.org/10.1007/978-981-15-6318-8_17


  • DOI: https://doi.org/10.1007/978-981-15-6318-8_17


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-6317-1

  • Online ISBN: 978-981-15-6318-8

  • eBook Packages: Computer Science (R0)
