Abstract
Target-dependent sentiment analysis is a fine-grained sentiment analysis task that has received increasing attention. Its key issue is to capture the context information that matters for a given target word. Because some critical context may lie far from the target in word order, it is important to capture such long-range information adequately and directly. Dependency relations can connect words that are syntactically related but distant in word order. Inspired by this, we propose the Dependency-Attention-based Long Short-Term Memory Network (DAT-LSTM) and the Segmented Dependency-Attention-based Long Short-Term Memory Network (Seg-DAT-LSTM) for target-dependent sentiment analysis. The dependency-attention mechanism uses dependency relations to fully capture long-range information for a given target. Experiments on the tweet dataset and the SemEval 2014 dataset show that our models achieve state-of-the-art performance on target-dependent sentiment classification.
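The abstract does not spell out how the dependency-attention mechanism scores context words. A minimal sketch of one plausible realisation — weighting each position by its dependency-tree distance to the target, so syntactically close but sequentially distant words still receive high attention — might look like this (the function names, the `decay` parameter, and the toy tree encoding are all illustrative, not the authors' actual formulation):

```python
import math

# Toy sketch: attend over per-token hidden states, scoring each token by its
# dependency-tree distance to the target token. The tree is encoded as
# {child_index: head_index}; the root has no entry.

def tree_distance(deps, i, j):
    """Number of dependency arcs on the path between tokens i and j."""
    def path_to_root(k):
        path = [k]
        while k in deps:
            k = deps[k]
            path.append(k)
        return path
    pi, pj = path_to_root(i), path_to_root(j)
    common = set(pi) & set(pj)
    # Steps from i and from j down to their lowest common ancestor.
    return min(pi.index(c) for c in common) + min(pj.index(c) for c in common)

def dependency_attention(hidden, deps, target, decay=0.5):
    """Softmax over scores that decay with dependency distance to the target;
    returns the attention weights and the weighted sum of hidden states."""
    scores = [-decay * tree_distance(deps, i, target) for i in range(len(hidden))]
    exps = [math.exp(s) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(hidden[0])
    context = [sum(w * h[d] for w, h in zip(weights, hidden)) for d in range(dim)]
    return weights, context
```

For example, with three tokens whose heads are `{0: 2, 1: 2}` and target token 0, the target itself gets the largest weight and token 2 (one arc away) outweighs token 1 (two arcs away), regardless of surface distance. In the paper's models an LSTM would produce the hidden states and the scores would be learned rather than a fixed decay.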
Copyright information
© 2017 Springer Nature Singapore Pte Ltd.
Cite this paper
Wang, X., Chen, G. (2017). Dependency-Attention-Based LSTM for Target-Dependent Sentiment Analysis. In: Cheng, X., Ma, W., Liu, H., Shen, H., Feng, S., Xie, X. (eds) Social Media Processing. SMP 2017. Communications in Computer and Information Science, vol 774. Springer, Singapore. https://doi.org/10.1007/978-981-10-6805-8_17
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-6804-1
Online ISBN: 978-981-10-6805-8