Abstract
In this paper, we study the task of review headline generation, which produces a short headline from a review posted by a user. We argue that this task is more challenging than document summarization because the headlines written by users vary from person to person: a model must not only capture the preferences of the user who posts the review, but also mine which aspects of the review the user emphasizes when writing the headline. To this end, we propose to incorporate user information as prior knowledge into the encoder and decoder of a general sequence-to-sequence model. Specifically, we introduce an embedding for each user, and use these embeddings either to initialize the encoder and decoder or as biases for the decoder initialization. We construct a review headline generation dataset, and experiments on this dataset demonstrate that our models significantly outperform baseline models that do not consider user information.
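The following is a minimal sketch (not the authors' released code) of the idea described above: a learned user embedding either initializes the encoder/decoder hidden states or is added as a bias to the decoder initialization of a sequence-to-sequence model. All dimensions, vocabulary sizes, and module names are illustrative assumptions.

```python
# Hypothetical sketch of user-conditioned seq2seq headline generation.
# The user vector either serves as the initial hidden state ("init")
# or is added as a bias to the decoder initialization ("bias").
import torch
import torch.nn as nn


class UserAwareSeq2Seq(nn.Module):
    def __init__(self, vocab_size=10000, num_users=5000,
                 emb_dim=128, hidden_dim=256, user_mode="init"):
        super().__init__()
        assert user_mode in ("init", "bias")
        self.user_mode = user_mode
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.user_emb = nn.Embedding(num_users, hidden_dim)   # one vector per user
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, review_ids, headline_ids, user_ids):
        u = self.user_emb(user_ids).unsqueeze(0)              # (1, batch, hidden)

        if self.user_mode == "init":
            # Use the user vector to initialize the encoder;
            # the decoder then starts from the encoder's final state.
            enc_out, enc_h = self.encoder(self.word_emb(review_ids), u)
            dec_h0 = enc_h
        else:  # "bias"
            enc_out, enc_h = self.encoder(self.word_emb(review_ids))
            # Add the user vector as a bias to the decoder initialization.
            dec_h0 = torch.tanh(enc_h + u)

        dec_out, _ = self.decoder(self.word_emb(headline_ids), dec_h0)
        return self.out(dec_out)                               # (batch, tgt_len, vocab)


# Toy usage: batch of 2 reviews (length 7) with target headlines (length 4).
model = UserAwareSeq2Seq()
reviews = torch.randint(0, 10000, (2, 7))
headlines = torch.randint(0, 10000, (2, 4))
users = torch.tensor([3, 42])
logits = model(reviews, headlines, users)
print(logits.shape)  # torch.Size([2, 4, 10000])
```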
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Liu, T., Li, H., Zhu, J., Zhang, J., Zong, C. (2018). Review Headline Generation with User Embedding. In: Sun, M., Liu, T., Wang, X., Liu, Z., Liu, Y. (eds.) Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data. CCL/NLP-NABD 2018. Lecture Notes in Computer Science, vol. 11221. Springer, Cham. https://doi.org/10.1007/978-3-030-01716-3_27
DOI: https://doi.org/10.1007/978-3-030-01716-3_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-01715-6
Online ISBN: 978-3-030-01716-3
eBook Packages: Computer Science (R0)