Abstract
In view of the shortcomings of the N-gram language model, this paper presents a language model based on Long Short-Term Memory (LSTM), an improved recurrent neural network (RNN) architecture that can, in theory, exploit arbitrarily long spans of context. Experimental results show that on the PTB corpus the perplexity of the LSTM language model is only half that of the N-gram language model.
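The abstract compares the two models by perplexity. As a minimal illustration (not the paper's code), perplexity is the exponential of the negative mean per-token log-probability that a model assigns to a held-out sequence; lower is better. The function and example values below are assumptions for demonstration only:

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(-mean log-probability) over the evaluated tokens."""
    n = len(token_log_probs)
    return math.exp(-sum(token_log_probs) / n)

# Illustration: a uniform model over a 4-word vocabulary assigns p = 0.25
# to every token, so its perplexity is 4 regardless of sequence length.
uniform = [math.log(0.25)] * 10
print(perplexity(uniform))
```

A model that halves perplexity, as the LSTM is reported to do here relative to the N-gram baseline, is on average twice as confident (per token) about the held-out text.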
© 2019 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
Cite this paper
Zhang, Y., Lu, X., Quan, B., Wei, Y. (2019). A Proposed Language Model Based on LSTM. In: Li, B., Yang, M., Yuan, H., Yan, Z. (eds) IoT as a Service. IoTaaS 2018. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 271. Springer, Cham. https://doi.org/10.1007/978-3-030-14657-3_35
DOI: https://doi.org/10.1007/978-3-030-14657-3_35
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-14656-6
Online ISBN: 978-3-030-14657-3
eBook Packages: Computer Science, Computer Science (R0)