Abstract
Fine-grained opinion analysis, particularly the identification of opinion targets and opinion expressions, is an important task. In this paper, we propose a new neural architecture for the sentence-level joint extraction of opinion targets and opinion expressions. The architecture, called the cascaded model, comprises (from bottom to top) the pre-trained BERT-Base model, linguistic features, a bi-directional LSTM, a soft attention network, and a CRF layer. The cascaded model achieves the best joint-extraction results on the SemEval-2014 Task 4 and SemEval-2016 Task 5 data sets compared with the state of the art. Our work makes three main contributions: (1) an attention network is introduced into the sentence-level joint extraction of opinion targets and opinion expressions, strengthening the dependence between the two; (2) the pre-trained BERT-Base model and linguistic features are incorporated, which greatly improve both the convergence speed and the performance of the cascaded model; (3) opinion targets and opinion expressions are extracted synchronously, achieving better results than most existing pipelined methods.
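The distinctive upper layers of the cascaded model can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: it assumes dot-product soft attention over BiLSTM states and standard Viterbi decoding for the CRF layer; the layer sizes, random inputs, and five-tag scheme (O, B-TARGET, I-TARGET, B-EXPR, I-EXPR) are illustrative stand-ins.

```python
# Sketch of the cascaded model's top layers (hypothetical shapes):
# contextual features -> BiLSTM -> soft attention -> CRF (Viterbi decode).
import numpy as np

rng = np.random.default_rng(0)

def soft_attention(H):
    """Soft attention over BiLSTM states H (T, d): each position attends
    to all positions, weighted by a softmax of dot-product scores."""
    scores = H @ H.T                                  # (T, T) similarities
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                 # row-wise softmax
    return w @ H                                      # (T, d) attended states

def viterbi(emissions, transitions):
    """CRF decoding: best tag path for per-token scores (T, K)
    and tag-transition scores (K, K)."""
    T, K = emissions.shape
    dp = emissions[0].copy()                          # best score ending in tag k
    back = np.zeros((T, K), dtype=int)                # backpointers
    for t in range(1, T):
        cand = dp[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        dp = cand.max(axis=0)
    tags = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):
        tags.append(int(back[t, tags[-1]]))
    return tags[::-1]

# Dummy 5-token sentence; 5 tags (O, B-TARGET, I-TARGET, B-EXPR, I-EXPR).
H = rng.normal(size=(5, 8))       # stand-in for BiLSTM outputs
A = soft_attention(H)             # attended token representations
W = rng.normal(size=(8, 5))       # hypothetical projection to tag scores
path = viterbi(A @ W, rng.normal(size=(5, 5)))
print(path)                       # one tag index per token
```

In this sketch, the attention step lets each token's representation mix in evidence from the whole sentence before tagging, which is what couples target and expression decisions, while the CRF transition matrix enforces consistent BIO sequences.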
Acknowledgments
We are grateful to the anonymous reviewers for their insightful comments and suggestions to improve the paper. This research is financially supported by the National Key Research and Development Program of China (No. 2018YFC0704306, No. 2017YFB0803301, No. 2018YFC0704304).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Liu, Q., Hu, Y. (2019). Joint Extraction of Opinion Targets and Opinion Expressions Based on Cascaded Model. In: Nayak, A., Sharma, A. (eds.) PRICAI 2019: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol. 11672. Springer, Cham. https://doi.org/10.1007/978-3-030-29894-4_44
Print ISBN: 978-3-030-29893-7
Online ISBN: 978-3-030-29894-4