
Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks

  • Regular Paper
  • Published in: Journal of Computer Science and Technology

Abstract

Question answering (QA) over knowledge base (KB) aims to provide a structured answer from a knowledge base to a natural language question. A key step in this task is representing and understanding the natural language query. In this paper, we propose to model natural language queries with tree-structured neural networks built on the constituency tree. We make an interesting observation about the constituency tree: different constituents have their own semantic characteristics and may be suited to different subtasks in a QA system. Based on this observation, we incorporate type information as an auxiliary supervision signal to improve QA performance, and we call our approach type-aware QA. We jointly characterize both the answer and its answer type in a unified neural network model with an attention mechanism. Instead of simply using the root representation, we represent the query by combining the representations of different constituents with task-specific attention weights. Extensive experiments on public datasets demonstrate the effectiveness of the proposed model. More specifically, the learned attention weights are quite useful for understanding the query, and the representations produced for intermediate nodes can be used to analyze the effectiveness of components in a QA system.
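To make the idea concrete, the following is a minimal, illustrative sketch in PyTorch, not the authors' exact architecture: a bottom-up composition over a binarized constituency tree yields a vector for every constituent, and two task-specific attention heads (one for answer matching, one for answer-type prediction) pool those constituent vectors into separate query representations. Class and parameter names such as TypeAwareTreeEncoder, attn_answer, and attn_type are illustrative assumptions rather than names taken from the paper.

```python
# Illustrative sketch of type-aware query encoding over a constituency tree.
# All names here are assumptions for exposition, not the paper's implementation.
import torch
import torch.nn as nn


class ConstituencyNode:
    """A node of a binarized constituency tree; leaves carry a word index."""
    def __init__(self, word_idx=None, left=None, right=None):
        self.word_idx, self.left, self.right = word_idx, left, right


class TypeAwareTreeEncoder(nn.Module):
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.compose = nn.Linear(2 * dim, dim)   # recursive composition of two children
        self.attn_answer = nn.Linear(dim, 1)     # attention head for answer matching
        self.attn_type = nn.Linear(dim, 1)       # attention head for answer-type prediction

    def _encode(self, node, states):
        """Bottom-up pass; appends one vector per constituent to `states`."""
        if node.word_idx is not None:            # leaf: word embedding
            h = self.embed(torch.tensor([node.word_idx])).squeeze(0)
        else:                                    # internal node: compose the two children
            hl = self._encode(node.left, states)
            hr = self._encode(node.right, states)
            h = torch.tanh(self.compose(torch.cat([hl, hr])))
        states.append(h)
        return h

    def forward(self, root):
        states = []
        self._encode(root, states)
        H = torch.stack(states)                  # (num_constituents, dim)
        # Task-specific attention: each subtask weights the constituents differently,
        # instead of reading only the root representation.
        a_ans = torch.softmax(self.attn_answer(H).squeeze(-1), dim=0)
        a_typ = torch.softmax(self.attn_type(H).squeeze(-1), dim=0)
        q_answer = a_ans @ H                     # query vector for answer scoring
        q_type = a_typ @ H                       # query vector for answer-type supervision
        return q_answer, q_type
```

In a full system of this kind, q_type would plausibly be trained against answer-type labels as the auxiliary supervision signal, while q_answer would be matched against candidate answers from the KB; the learned attention weights then indicate which constituents each subtask relies on.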




Author information


Corresponding author

Correspondence to Xiao-Ming Li.

Additional information

Wayne Xin Zhao is a co-first author.

Electronic supplementary material

Below is the link to the electronic supplementary material.

ESM 1 (PDF 221 kb)


About this article


Cite this article

Yin, J., Zhao, W.X. & Li, X.-M. Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks. J. Comput. Sci. Technol. 32, 805–813 (2017). https://doi.org/10.1007/s11390-017-1761-8

