
ACA: Attention-Based Context-Aware Answer Selection System

  • Conference paper
Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 1085)

Abstract

The main goal of a question answering system is to develop chatbots capable of answering questions irrespective of the domain. The system must provide appropriate answers to user queries. The challenge of a question answering system lies in analyzing the question so that accurate answers can be retrieved from the large amount of data available. The aim of this paper is to propose a question answering system that analyzes the question and answers precisely according to the question type. The model uses similarity measures to select a candidate set, the most relevant answer sentences from the given content. The candidate set is then analyzed further using a bidirectional long short-term memory (BiLSTM) network to retrieve the appropriate answer. Based on the question type, we return the exact word/phrase as a response to the user; this is done using an attention mechanism that finds the exact answer to the query. We compare models without memory against the attention-based BiLSTM. Using the attention-based BiLSTM, the retrieved answers showed a 17% increase in accuracy compared with attention-free models such as convolutional neural networks (CNN) and recurrent neural networks (RNN).
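The abstract describes a two-stage pipeline: a similarity measure first narrows the passage to a candidate set of sentences, and an attention-based BiLSTM then scores each candidate against the question while highlighting the tokens most relevant to it. The following is a minimal illustrative sketch of that idea in PyTorch; it is not the authors' implementation, and the module names, dimensions, cosine-similarity choice, and mean-pooled question summary are assumptions made for illustration only.

```python
# Minimal sketch (not the authors' code) of the two-stage pipeline described
# in the abstract: (1) similarity-based candidate sentence selection,
# (2) an attention-based BiLSTM that scores candidates against the question.
# Dimensions, pooling choices, and cosine similarity are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


def select_candidates(question_vec, sentence_vecs, top_k=3):
    """Stage 1: keep the top-k sentences most similar to the question.

    question_vec: (dim,) averaged word embeddings of the question.
    sentence_vecs: (num_sentences, dim) averaged embeddings per sentence.
    """
    sims = F.cosine_similarity(sentence_vecs, question_vec.unsqueeze(0), dim=1)
    top = torch.topk(sims, k=min(top_k, sentence_vecs.size(0)))
    return top.indices, top.values


class AttentiveBiLSTMScorer(nn.Module):
    """Stage 2: score a candidate sentence against the question.

    A BiLSTM encodes both sequences; question-aware attention pools the
    candidate's hidden states before a bilinear match score is computed.
    """

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.score = nn.Bilinear(2 * hidden_dim, 2 * hidden_dim, 1)

    def encode(self, token_ids):
        out, _ = self.bilstm(self.embed(token_ids))       # (B, T, 2H)
        return out

    def forward(self, question_ids, candidate_ids):
        q = self.encode(question_ids).mean(dim=1)         # (B, 2H) question summary
        c = self.encode(candidate_ids)                    # (B, T, 2H)
        # Attention weights: relevance of each candidate token to the question.
        weights = torch.softmax(
            torch.bmm(self.attn(c), q.unsqueeze(2)).squeeze(2), dim=1)   # (B, T)
        pooled = torch.bmm(weights.unsqueeze(1), c).squeeze(1)           # (B, 2H)
        return self.score(q, pooled).squeeze(1), weights  # match score + attention
```

In a setup like this, the token-level attention weights returned by the scorer could be used to pick the highest-weighted word or phrase from the best-scoring candidate as the exact answer for factoid-type questions, which is in the spirit of the answer extraction step the abstract describes.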


Notes

  1. https://rajpurkar.github.io/SQUAD-explorer/explore/1.1/dev/.



Author information


Correspondence to K. Sundarakantham.


Copyright information

© 2020 Springer Nature Singapore Pte Ltd.


Cite this paper

Sundarakantham, K., Felicia Lilian, J., Rajashree, H., Mercy Shalinie, S. (2020). ACA: Attention-Based Context-Aware Answer Selection System. In: Agarwal, S., Verma, S., Agrawal, D. (eds) Machine Intelligence and Signal Processing. MISP 2019. Advances in Intelligent Systems and Computing, vol 1085. Springer, Singapore. https://doi.org/10.1007/978-981-15-1366-4_26
