Deep Learning Methods in Natural Language Processing

  • Alexis Stalin Alulema Flores
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1194)


This paper provides a concise description of current deep learning methods for natural language processing (NLP) and discusses their advantages and disadvantages. It further examines the applicability of each deep learning method in the context of natural language processing. Additionally, a series of significant advances that have driven the processing, understanding, and generation of natural language is discussed.


Keywords: Natural language processing (NLP) · Deep learning · Neural networks



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. University of New Mexico, Albuquerque, USA
  2. Number8, Louisville, USA
