
Joint Extraction of Opinion Targets and Opinion Expressions Based on Cascaded Model

  • Conference paper
PRICAI 2019: Trends in Artificial Intelligence (PRICAI 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11672)


Abstract

Fine-grained opinion analysis is an important task, a key part of which is identifying opinion targets and opinion expressions. In this paper, a new neural architecture is proposed for the sentence-level joint extraction of opinion targets and opinion expressions. The architecture, called the cascaded model, comprises, from bottom to top, the pre-trained BERT-Base model, linguistic features, a bi-directional LSTM, a soft attention network, and a CRF layer. The cascaded model achieves the best joint extraction results on the SemEval-2014/2016 Task 4/5 data sets compared with the state of the art. Our work makes three main contributions: (1) an attention network is introduced into the task of sentence-level joint extraction of opinion targets and opinion expressions, which strengthens the dependence between targets and expressions; (2) the pre-trained BERT-Base model and linguistic features are incorporated, which greatly improve the convergence speed and performance of the cascaded model; (3) opinion targets and opinion expressions are extracted synchronously, achieving better results than most existing pipelined methods.
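The abstract frames joint extraction as sentence-level sequence labeling ending in a CRF layer. A common way to realize the joint output (the specific tag names below are an assumption for illustration, not taken from the paper) is a BIO scheme with separate label types for targets and expressions, decoded into spans after tagging. A minimal sketch:

```python
# Illustrative BIO-style decoding for joint opinion target/expression
# extraction. The tag set (B-TARG, I-TARG, B-EXPR, I-EXPR, O) is assumed
# for illustration; a CRF layer would emit one such tag per token, and
# both span types are recovered in a single pass (the "joint" extraction).

def decode_spans(tokens, tags):
    """Group BIO tags into (span_type, phrase) pairs."""
    spans, current = [], None  # current = (span_type, [tokens so far])
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append((current[0], " ".join(current[1])))
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)
        else:  # "O" or an inconsistent I- tag closes any open span
            if current:
                spans.append((current[0], " ".join(current[1])))
            current = None
    if current:
        spans.append((current[0], " ".join(current[1])))
    return spans

tokens = ["The", "battery", "life", "is", "really", "great"]
tags   = ["O", "B-TARG", "I-TARG", "O", "B-EXPR", "I-EXPR"]
print(decode_spans(tokens, tags))
# → [('TARG', 'battery life'), ('EXPR', 'really great')]
```

Because targets and expressions come out of one tag sequence, a single model pass yields both, in contrast to pipelined methods that extract one type first and condition the second extraction on it.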



Acknowledgments

We are grateful to the anonymous reviewers for their insightful comments and suggestions to improve the paper. This research is financially supported by The National Key Research and Development Program of China (No. 2018YFC0704306, No. 2017YFB0803301, No. 2018YFC0704304).

Author information


Correspondence to Quanchao Liu.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, Q., Hu, Y. (2019). Joint Extraction of Opinion Targets and Opinion Expressions Based on Cascaded Model. In: Nayak, A., Sharma, A. (eds.) PRICAI 2019: Trends in Artificial Intelligence. PRICAI 2019. Lecture Notes in Computer Science (LNAI), vol. 11672. Springer, Cham. https://doi.org/10.1007/978-3-030-29894-4_44



  • Print ISBN: 978-3-030-29893-7

  • Online ISBN: 978-3-030-29894-4
