
Generating Textual Entailment Using Residual LSTMs

  • Maosheng Guo
  • Yu Zhang
  • Dezhi Zhao
  • Ting Liu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10565)

Abstract

Generating textual entailment (GTE) is a recently proposed task that studies how to infer a sentence from a given premise. Current sequence-to-sequence GTE models are prone to producing invalid sentences when faced with sufficiently complex premises. Moreover, the lack of appropriate evaluation criteria hinders research on GTE. In this paper, we conjecture that an underpowered encoder is the major bottleneck in generating more meaningful sequences, and address this by employing a residual LSTM network. With the extended model, we obtain state-of-the-art results. Furthermore, we propose a novel metric for GTE, namely EBR (Evaluated By Recognizing textual entailment), which evaluates different GTE approaches objectively and fairly without human effort while also accounting for the diversity of valid inferences. Finally, we point out the limitations of adapting a general sequence-to-sequence framework to the GTE setting and offer some proposals for future research, hoping to stimulate further public discussion.
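
Two short sketches may help make the abstract concrete. The first is a minimal residual LSTM encoder in PyTorch: a stack of LSTM layers with identity shortcuts between the stacked layers, in the spirit of the residual connections this paper adds to the sequence-to-sequence encoder. The class name, layer count, and dimensions are illustrative assumptions, not the authors' implementation.

import torch.nn as nn

class ResidualLSTMEncoder(nn.Module):
    """Stacked LSTM encoder with residual (identity) connections between layers."""
    def __init__(self, input_size, hidden_size, num_layers=3):
        super().__init__()
        # The first layer maps word embeddings to the hidden size; later layers keep
        # the size fixed so that the residual addition is well defined.
        self.layers = nn.ModuleList(
            [nn.LSTM(input_size if i == 0 else hidden_size, hidden_size, batch_first=True)
             for i in range(num_layers)])

    def forward(self, x):
        # x: (batch, seq_len, input_size) embeddings of the premise
        output, state = self.layers[0](x)
        for lstm in self.layers[1:]:
            new_output, state = lstm(output)
            output = output + new_output  # residual shortcut across the layer
        return output, state              # passed to the decoder of the seq2seq model

The second sketch illustrates one way an EBR-style score could be computed: run a pretrained recognizing-textual-entailment (RTE) classifier over each (premise, generated hypothesis) pair and report the fraction labelled as entailment. The rte_model interface below is a hypothetical placeholder, not a specific library API.

def ebr_score(pairs, rte_model):
    """Fraction of generated hypotheses that a pretrained RTE model labels 'entailment'.

    `pairs` is a list of (premise, generated_hypothesis) tuples; `rte_model.predict`
    is assumed to return one of {'entailment', 'neutral', 'contradiction'}.
    """
    entailed = sum(1 for premise, hypothesis in pairs
                   if rte_model.predict(premise, hypothesis) == "entailment")
    return entailed / len(pairs)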

Keywords

Generating textual entailment · Natural language generation · Natural language processing · Artificial intelligence

Notes

Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Nos. 61472105 and 61472107) and the National High Technology Research and Development Program of China (863 Program) (Grant No. 2015AA015407).


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. School of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
