Aspect-Based Sentiment Analysis of Nuclear Energy Tweets with Attentive Deep Neural Network

  • Zhengyuan Liu
  • Jin-Cheon Na
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11279)

Abstract

Opinion mining of social networking sites such as Facebook and Twitter plays an important role in extracting value from online user-generated content. In contrast to sentence-level sentiment classification, aspect-based analysis, which can infer polarities towards multiple aspects within a single sentence, yields more in-depth insight. However, with traditional machine learning approaches, training such a fine-grained model often requires substantial manual feature engineering. In this article, we propose a deep learning model for aspect-level sentiment analysis and apply it to nuclear energy related tweets to understand public opinion towards nuclear energy. We also build a new dataset for this task, and the evaluation results show that our attentive neural network obtains insightful inferences on rather complex forms of expression and achieves state-of-the-art performance.
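The core of attentive models of this kind (e.g., attention-based LSTMs for aspect-level sentiment) is an aspect-conditioned attention step: each token's hidden state is scored against an aspect embedding, the scores are normalized into attention weights, and the weighted sum feeds a polarity classifier. The following NumPy sketch illustrates that mechanism only; the dimensions, random parameters, and the bilinear scoring form are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def aspect_attention(hidden, aspect, W):
    """Score each token's hidden state against the aspect vector
    (bilinear form h^T W a), then pool the sequence into a single
    aspect-aware representation."""
    scores = np.array([h @ W @ aspect for h in hidden])  # (seq_len,)
    alpha = softmax(scores)                              # attention weights
    context = alpha @ hidden                             # (d,) pooled vector
    return alpha, context

rng = np.random.default_rng(0)
seq_len, d, n_classes = 6, 8, 3         # toy sizes: 6 tokens, 8-dim states
hidden = rng.normal(size=(seq_len, d))  # stand-in for LSTM outputs
aspect = rng.normal(size=d)             # stand-in aspect embedding, e.g. "safety"
W = rng.normal(size=(d, d))             # learned bilinear attention matrix

alpha, context = aspect_attention(hidden, aspect, W)
logits = context @ rng.normal(size=(d, n_classes))  # negative/neutral/positive
```

Because the attention weights are conditioned on the aspect vector, the same sentence can yield different pooled representations (and thus different polarities) for different aspects, which is what distinguishes aspect-level classification from sentence-level classification.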

Keywords

Deep learning · Sentiment analysis · Social network · Natural language processing

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Wee Kim Wee School of Communication and Information, Nanyang Technological University, Singapore