Augmented sentiment representation by learning context information

  • Hu Han
  • Xuxu Bai
  • Ping Li
Original Article

Abstract

Identifying the sentiment polarity of a document is a building block of sentiment analysis and other natural language processing tasks. It aims to automatically predict a user's sentiment orientation toward a product described in the document, under the assumption that the document expresses a sentiment about a single product. Supervised machine learning models such as support vector machines and, more recently, deep neural network methods have been used extensively for sentiment learning. Although some neural network-based models learn text features without feature engineering, most of them focus only on extracting semantic representations of single words and rarely consider the context arising from correlations between words and sentences. In this paper, we propose a novel neural network model that captures context information from texts. Our model is a hybrid neural network that uses convolutional neural networks for word-level context extraction and long short-term memory for document representation. On this basis, user and product information can be incorporated into the model. The experimental results show that our model achieves competitive performance compared with state-of-the-art methods.
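The architecture described in the abstract can be read concretely as a CNN that extracts local word-context features within each sentence, followed by an LSTM that composes the sentence vectors into a document representation, with user and product embeddings appended before classification. The PyTorch sketch below illustrates this reading only; all layer sizes, the pooling scheme, and the way user/product vectors are concatenated are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch of a hybrid CNN-LSTM sentiment model as described in the
# abstract. Hyperparameters and the fusion of user/product information are
# assumptions for illustration, not the authors' published settings.
import torch
import torch.nn as nn


class CnnLstmSentimentModel(nn.Module):
    def __init__(self, vocab_size, n_users, n_products, n_classes,
                 emb_dim=200, n_filters=100, kernel_sizes=(3, 4, 5),
                 lstm_hidden=100, meta_dim=50):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Convolutions over word windows capture local context within a sentence.
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, n_filters, k, padding=k // 2) for k in kernel_sizes
        )
        sent_dim = n_filters * len(kernel_sizes)
        # The LSTM composes sentence vectors into a document representation.
        self.lstm = nn.LSTM(sent_dim, lstm_hidden, batch_first=True)
        # User and product embeddings are concatenated with the document vector.
        self.user_emb = nn.Embedding(n_users, meta_dim)
        self.prod_emb = nn.Embedding(n_products, meta_dim)
        self.classifier = nn.Linear(lstm_hidden + 2 * meta_dim, n_classes)

    def forward(self, docs, users, products):
        # docs: (batch, n_sentences, n_words) of word ids
        b, s, w = docs.shape
        x = self.word_emb(docs.view(b * s, w))              # (b*s, w, emb)
        x = x.transpose(1, 2)                               # (b*s, emb, w)
        # Max-pool each convolution over time to get fixed-size sentence features.
        feats = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        sent_vecs = torch.cat(feats, dim=1).view(b, s, -1)  # (b, s, sent_dim)
        _, (h_n, _) = self.lstm(sent_vecs)
        doc_vec = h_n[-1]                                    # (b, lstm_hidden)
        meta = torch.cat([self.user_emb(users), self.prod_emb(products)], dim=1)
        return self.classifier(torch.cat([doc_vec, meta], dim=1))


# Example forward pass with random ids (shapes only, not real data).
model = CnnLstmSentimentModel(vocab_size=10000, n_users=50, n_products=50, n_classes=5)
docs = torch.randint(1, 10000, (2, 4, 30))   # 2 documents, 4 sentences, 30 words each
logits = model(docs, torch.tensor([0, 1]), torch.tensor([2, 3]))
print(logits.shape)  # torch.Size([2, 5])
```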

Keywords

Sentiment classification · Supervised learning · Convolutional neural networks · Context information

Notes

Acknowledgements

This work was supported by the National Social Science Foundation of China (No. 17BXW071) and the Technology Program of Lanzhou Science and Technology Bureau (No. 214162). P. Li acknowledges the SWPU Innovation Team "Data Intelligence" Funding (No. 2015CXTD06) and the NSFC (No. 81373531).

Compliance with ethical standards

Conflict of interest

We declare that we do not have any commercial or associative interest that represents a conflict of interest in connection with the work submitted.

Copyright information

© The Natural Computing Applications Forum 2018

Authors and Affiliations

  1. School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou, China
  2. Center for Intelligent and Networked Systems, School of Computer Science, Southwest Petroleum University, Chengdu, China
