A QA System Based on Bidirectional LSTM with Text Similarity Calculation Model

  • Wenhua Xu
  • Hao Huang
  • Hao Gu
  • Jie Zhang
  • Guan Gui (corresponding author)
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 279)

Abstract

The development of deep learning in recent years has driven progress in natural language processing [1]. The question answering (QA) system is an important branch of natural language processing; it benefits from the application of neural networks, and its performance is therefore constantly improving. Recurrent neural networks (RNNs) and long short-term memory (LSTM) networks are commonly applied in natural language processing. Inspired by work on machine translation, this paper builds an intelligent QA system for the specific domain of extension services. After analyzing the shortcomings of the RNN and the advantages of the LSTM network, we choose the bidirectional LSTM. To improve performance, this paper adds a text similarity calculation to the QA system. In the experiments, the convergence of the system and the accuracy of its answers to questions show that the system performs well.
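The text-similarity component mentioned above can be illustrated with a minimal sketch: the snippet below matches an incoming question against stored question-answer pairs using TF-IDF weighting and cosine similarity. The abstract does not specify the paper's exact similarity measure, so the weighting scheme, tokenization, and all function names here are illustrative assumptions, not the authors' implementation.

```python
import math
from collections import Counter

def vectorize(tokens, df, n_docs):
    """TF-IDF weight dict for one tokenized text, with smoothed IDF."""
    tf = Counter(tokens)
    return {t: (c / len(tokens)) * (math.log((1 + n_docs) / (1 + df[t])) + 1)
            for t, c in tf.items()}

def cosine(u, v):
    """Cosine similarity between two sparse weight dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def best_answer(query, qa_pairs):
    """Return the stored answer whose question is most similar to the query."""
    questions = [q.lower().split() for q, _ in qa_pairs]
    df = Counter()                      # document frequency over stored questions
    for q in questions:
        df.update(set(q))
    n = len(questions)
    q_vec = vectorize(query.lower().split(), df, n)
    scores = [cosine(q_vec, vectorize(q, df, n)) for q in questions]
    return qa_pairs[max(range(n), key=scores.__getitem__)][1]
```

In the full system described by the paper, a bidirectional LSTM encoder would replace these sparse vectors with learned sentence representations, with a similarity score ranking the candidate answers.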

Keywords

QA system · Deep learning · RNN · LSTM

References

  1. LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436 (2015)
  2. Spyns, P.: Natural language processing in medicine: an overview. Methods Inf. Med. 35(04), 285–301 (1996)
  3. Turing, A.M.: Computing machinery and intelligence. In: Epstein, R., Roberts, G., Beber, G. (eds.) Parsing the Turing Test, pp. 23–65. Springer, Dordrecht (2009). https://doi.org/10.1007/978-1-4020-6710-5_4
  4. van Merriënboer, B.: Learning phrase representations using RNN encoder–decoder for statistical machine translation. In: Empirical Methods in Natural Language Processing, Doha, Qatar, 25–29 October 2014, pp. 1724–1734 (2014)
  5. Sutskever, I., Vinyals, O., Le, Q.V.: Sequence to sequence learning with neural networks. In: Neural Information Processing Systems, Kuching, Malaysia, 3–6 November 2014, pp. 3104–3112 (2014)
  6. Zhao, S.H., Li, J.-Y., Xu, B.-R., et al.: Improved TFIDF-based question similarity algorithm for the community interlocution systems. Trans. Beijing Inst. Technol. 37(9), 982–985 (2017)
  7. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. In: International Conference on Learning Representations, San Diego, CA, USA, 7–9 May 2015, pp. 1–6 (2015)
  8. Tran, K.M., Bisazza, A., Monz, C.: Recurrent memory networks for language modeling. In: North American Chapter of the Association for Computational Linguistics, San Diego, CA, USA, 12–17 June 2016, pp. 321–331 (2016)
  9. Werbos, P.J.: Backpropagation through time: what it does and how to do it. Proc. IEEE 78(10), 1550–1560 (1990)
  10. Bengio, Y., Simard, P.Y., Frasconi, P.: Learning long-term dependencies with gradient descent is difficult. IEEE Trans. Neural Netw. 5(2), 157–166 (1994)
  11. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
  12. Greff, K., Srivastava, R.K., Koutnik, J., Steunebrink, B.R., Schmidhuber, J.: LSTM: a search space odyssey. IEEE Trans. Neural Netw. Learn. Syst. 28(10), 2222–2232 (2017)
  13. Graves, A., Fernandez, S., Schmidhuber, J.: Bidirectional LSTM networks for improved phoneme classification and recognition. In: International Conference on Artificial Neural Networks, Warsaw, Poland, 11–15 September 2005, pp. 799–804 (2005)

Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2019

Authors and Affiliations

  • Wenhua Xu (1)
  • Hao Huang (1)
  • Hao Gu (1)
  • Jie Zhang (1)
  • Guan Gui (1) (corresponding author)
  1. College of Telecommunication and Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing, China