DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT

  • Conference paper

Part of the proceedings: Advanced Parallel Processing Technologies (APPT 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11719)

Abstract

With the development of the Internet, text-based data from the web have grown exponentially, and these data carry a large amount of valuable information. As a vital branch of sentiment analysis, aspect sentiment analysis of short texts on social media has attracted the interest of researchers. Aspect sentiment classification is a form of fine-grained textual sentiment classification. Currently, the attention mechanism is mainly combined with RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) networks. Such neural-network-based sentiment analysis models not only have a complicated computational structure but also suffer from sequential computational dependence. To address these problems and improve the accuracy of target-based sentiment classification for short texts, we propose a neural network model that combines deep attention with Bidirectional Encoder Representations from Transformers (DA-BERT). The DA-BERT model can fully mine the relationships between target words and sentiment words in a sentence, and it requires neither syntactic analysis of sentences nor external knowledge such as a sentiment lexicon. By removing the sequential dependencies of the RNN structure, the training speed of the proposed model is greatly improved. Compared with LSTM, TD-LSTM, TC-LSTM, AT-LSTM, ATAE-LSTM, and PAT-LSTM, experiments on the SemEval 2014 Task 4 dataset show that DA-BERT improves accuracy by 13.63% on average with 300-dimensional word vectors in aspect sentiment classification.
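The abstract describes the core idea at a high level: score each token of a sentence against the target (aspect) representation with an attention mechanism, then pool the encoder outputs into an aspect-specific sentence vector. The full DA-BERT architecture is not reproduced here; the following is a minimal NumPy sketch of that target-word attention idea only. The function name, shapes, and random inputs are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def aspect_attention(hidden, target):
    """Attend over token representations using an aspect/target vector.

    hidden: (seq_len, dim) encoder outputs (e.g., from BERT)
    target: (dim,) embedding of the aspect/target words
    Returns the attention weights and the pooled sentence representation.
    """
    # Scaled dot-product score of each token against the target vector.
    scores = hidden @ target / np.sqrt(hidden.shape[1])   # (seq_len,)
    # Softmax (shifted by the max for numerical stability).
    weights = np.exp(scores - scores.max())
    weights = weights / weights.sum()                     # sums to 1
    # Weighted sum of token representations -> aspect-specific vector.
    context = weights @ hidden                            # (dim,)
    return weights, context

rng = np.random.default_rng(0)
hidden = rng.normal(size=(6, 8))   # 6 tokens, 8-dim hidden states
target = rng.normal(size=8)        # aspect embedding
w, ctx = aspect_attention(hidden, target)
print(w.shape, ctx.shape)
```

In the paper's setting the pooled vector would be fed to a softmax classifier over sentiment polarities; this sketch stops at the attention step, which is the part the abstract emphasizes (mining relationships between target words and sentiment words without syntactic analysis).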



Acknowledgements

We would like to thank the anonymous reviewers for their invaluable comments. This work was partially funded by the Shanghai Pujiang Program under Grant 16PJ1407600, the China Post-Doctoral Science Foundation under Grant 2017M610230, the National Natural Science Foundation of China under Grants 61332009 and 61775139, and the Open Project Funding from the State Key Lab of Computer Architecture, ICT, CAS under Grant CARCH201807. Any opinions, findings, and conclusions expressed in this paper are those of the authors and do not necessarily reflect the views of the sponsors.

Author information

Correspondence to Songwen Pei.
Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Pei, S., Wang, L., Shen, T., Ning, Z. (2019). DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT. In: Yew, PC., Stenström, P., Wu, J., Gong, X., Li, T. (eds) Advanced Parallel Processing Technologies. APPT 2019. Lecture Notes in Computer Science, vol 11719. Springer, Cham. https://doi.org/10.1007/978-3-030-29611-7_7

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-29611-7_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-29610-0

  • Online ISBN: 978-3-030-29611-7

  • eBook Packages: Computer Science (R0)