
Learning Word Sentiment with Neural Bag-Of-Words Model Combined with Ngram

  • Conference paper
  • In: Intelligence Science II (ICIS 2018)
  • Part of the book series: IFIP Advances in Information and Communication Technology (IFIP AICT, volume 539)

Abstract

To better analyze users' sentiment, attitudes, and emotions from written language, it is necessary to identify the sentiment polarity of each word, not only the overall sentiment (positive/neutral/negative) of a given text. In this paper we propose a novel approach based on the Neural Bag-Of-Words (NBOW) model combined with Ngram, aiming to achieve a good classification score on short texts (fewer than 200 words) while also recovering the sentiment polarity of each word. To verify the proposed method, we evaluate the classification accuracy and visualize the sentiment polarity of each word extracted from the model; the data sets in our experiments carry a sentiment label for each sentence only, with no information about the sentiment of individual words. Experimental results show that the proposed model not only correctly classifies sentence polarity but also successfully captures the sentiment of each word.
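The core NBOW idea described above can be sketched in a few lines: average the embeddings of the words (and optionally of word n-grams) in a sentence, score the average with a linear classifier, and read each word's sentiment polarity off as that word's individual contribution to the score. The following is a toy illustration only, not the authors' implementation; the vocabulary, embedding dimension, random initialisation, and helper names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary with randomly initialised embeddings (dimension 8).
vocab = {"good": 0, "bad": 1, "movie": 2, "great": 3, "awful": 4}
emb = rng.normal(size=(len(vocab), 8))   # word embedding matrix
w = rng.normal(size=8)                   # linear classifier weights
b = 0.0                                  # classifier bias

def word_ngrams(tokens, n):
    """All contiguous word n-grams of a token list (used to enrich the bag)."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def word_polarity(token):
    """A word's sentiment polarity: its embedding projected onto the classifier."""
    return float(emb[vocab[token]] @ w)

def nbow_score(tokens):
    """NBOW sentence score: average the word embeddings, apply the classifier."""
    vecs = np.stack([emb[vocab[t]] for t in tokens])
    return float(vecs.mean(axis=0) @ w + b)
```

Because the model is linear in the averaged embedding, the sentence score decomposes exactly into the mean of the per-word polarities plus the bias, which is what makes word-level sentiment readable from a model trained only with sentence-level labels.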


Notes

  1. https://www.kaggle.com/PromptCloudHQ/amazon-reviews-unlocked-mobile-phones/data.

  2. https://www.kaggle.com/c/twitter-airlines-sentiment-analysis/data.

  3. https://github.com/JingChunzhen/sentiment_analysis/tree/master/nbow.



Author information

Correspondence to Chunzhen Jing, Jian Li or Xiuyu Duan.


Copyright information

© 2018 IFIP International Federation for Information Processing

About this paper


Cite this paper

Jing, C., Li, J., Duan, X. (2018). Learning Word Sentiment with Neural Bag-Of-Words Model Combined with Ngram. In: Shi, Z., Pennartz, C., Huang, T. (eds) Intelligence Science II. ICIS 2018. IFIP Advances in Information and Communication Technology, vol 539. Springer, Cham. https://doi.org/10.1007/978-3-030-01313-4_21

  • DOI: https://doi.org/10.1007/978-3-030-01313-4_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01312-7

  • Online ISBN: 978-3-030-01313-4

  • eBook Packages: Computer Science (R0)
