Abstract
To better analyze the sentiment, attitudes, and emotions that users express in written language, it is necessary to identify the sentiment polarity of each word, not only the overall sentiment (positive/neutral/negative) of a given text. In this paper we propose a novel approach based on the Neural Bag-of-Words (NBOW) model combined with N-grams, aiming to achieve a good classification score on short texts (fewer than 200 words) while also recovering the sentiment polarity of each word. To verify the proposed method, we evaluate classification accuracy and visualize the word-level sentiment polarity extracted from the model; notably, the dataset used in our experiments provides only a sentence-level sentiment label, with no information about the sentiment of individual words. Experimental results show that the proposed model not only classifies sentence polarity correctly but also successfully captures the sentiment of each word.
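The core idea described above can be sketched as follows. This is a minimal illustrative toy, not the paper's actual architecture: the vocabulary, embedding size, and random weights are hypothetical, the model is untrained, and the paper's full N-gram combination and training procedure are omitted. An NBOW classifier averages the embeddings of a sentence's tokens (optionally augmented with bigram tokens) and feeds the average to a softmax classifier; passing a single word through the same classifier yields that word's polarity scores.

```python
import numpy as np

# Hypothetical toy setup: vocabulary, dimensions, and weights are
# illustrative placeholders, not the paper's trained parameters.
rng = np.random.default_rng(0)
vocab = {"great": 0, "terrible": 1, "movie": 2, "plot": 3}
embed_dim, n_classes = 8, 3          # positive / neutral / negative

E = rng.normal(size=(len(vocab), embed_dim))  # word embedding table
W = rng.normal(size=(embed_dim, n_classes))   # classifier weights
b = np.zeros(n_classes)                       # classifier bias

def softmax(z):
    z = z - z.max()                  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def with_bigrams(tokens):
    """Augment unigram tokens with bigram tokens (a simple N-gram stand-in)."""
    return tokens + [" ".join(p) for p in zip(tokens, tokens[1:])]

def nbow_predict(tokens):
    """Average the embeddings of all in-vocabulary tokens, then classify."""
    ids = [vocab[t] for t in with_bigrams(tokens) if t in vocab]
    h = E[ids].mean(axis=0)          # the bag-of-words average
    return softmax(h @ W + b)

def word_polarity(token):
    """Per-word sentiment: run a single word through the same classifier."""
    return softmax(E[vocab[token]] @ W + b)

probs = nbow_predict(["great", "movie"])
scores = word_polarity("terrible")
```

Because the same embedding-plus-classifier pipeline scores both whole sentences and single words, a model trained only on sentence-level labels can still be probed for word-level polarity, which is the property the paper exploits.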
Copyright information
© 2018 IFIP International Federation for Information Processing
Cite this paper
Jing, C., Li, J., Duan, X. (2018). Learning Word Sentiment with Neural Bag-Of-Words Model Combined with Ngram. In: Shi, Z., Pennartz, C., Huang, T. (eds) Intelligence Science II. ICIS 2018. IFIP Advances in Information and Communication Technology, vol 539. Springer, Cham. https://doi.org/10.1007/978-3-030-01313-4_21
Print ISBN: 978-3-030-01312-7
Online ISBN: 978-3-030-01313-4