Abstract
Recent research demonstrates that headline generation methods can help tackle the problem of information overload. However, the popular neural encoder-decoder framework struggles with headline generation for long source texts. In this paper, we propose a hierarchical attention headline generation model with a filter to address this problem. The model first relies on a filter to extract the crucial content of the source text; a hierarchical attention mechanism then accurately identifies important words, and the model finally generates a high-quality headline. Experimental results show that our model achieves higher ROUGE scores than classical models, and that it performs particularly well on long texts.
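The two-level attention described in the abstract can be illustrated with a minimal sketch: word-level attention first compresses each sentence into a vector, and sentence-level attention then weights those sentence vectors to form a document representation. This is a hypothetical numpy toy with dot-product scoring, not the authors' actual implementation (the paper's model additionally uses an extractive filter and a decoder).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(states, query):
    """Dot-product attention: weight each state by its score against the query."""
    weights = softmax(states @ query)        # one weight per state, sums to 1
    return weights @ states, weights         # weighted sum of states

def hierarchical_attention(sentences, word_query, sent_query):
    """Word-level attention inside each sentence, then sentence-level
    attention over the resulting sentence vectors."""
    sent_vecs = np.stack([attend(s, word_query)[0] for s in sentences])
    doc_vec, sent_weights = attend(sent_vecs, sent_query)
    return doc_vec, sent_weights

# Toy input: 3 sentences, each with 5 word embeddings of dimension 4.
rng = np.random.default_rng(0)
sents = [rng.normal(size=(5, 4)) for _ in range(3)]
doc_vec, sent_weights = hierarchical_attention(
    sents, rng.normal(size=4), rng.normal(size=4))
```

In a trained model the queries would be learned vectors and the scoring function would typically be an additive (MLP) score rather than a plain dot product; the sentence weights also indicate which sentences the decoder should draw on, which is the role the hierarchical attention plays here.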
Acknowledgments
This work was supported by the National Natural Science Foundation of China [Grant No. 61872228].
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
Cite this paper
Xie, J., Wang, X., Wang, X., Pang, G. (2020). A Hierarchical Attention Headline Generation Model with a Filter. In: Park, J., Yang, L., Jeong, Y.S., Hao, F. (eds) Advanced Multimedia and Ubiquitous Engineering. MUE/FutureTech 2019. Lecture Notes in Electrical Engineering, vol 590. Springer, Singapore. https://doi.org/10.1007/978-981-32-9244-4_47
DOI: https://doi.org/10.1007/978-981-32-9244-4_47
Publisher Name: Springer, Singapore
Print ISBN: 978-981-32-9243-7
Online ISBN: 978-981-32-9244-4
eBook Packages: Engineering (R0)