
BAB-QA: A New Neural Model for Emotion Detection in Multi-party Dialogue

  • Zilong Wang
  • Zhaohong Wan
  • Xiaojun Wan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11439)

Abstract

In this paper, we propose a new neural model, BAB-QA, for the task of emotion detection in multi-party dialogues, i.e., labeling each utterance in a dialogue with one of four emotions: joy, sadness, anger, or neutral. A variety of models have been proposed for this task, but few of them capture the contextual information in a dialogue properly. We therefore adopt a bi-directional Long Short-Term Memory network (BiLSTM) and an attention network to obtain sentence representations, and then apply a contextualization network to refine these representations for classification. More importantly, we propose and incorporate a new module, the QA network, which is inspired by natural language inference tasks. The QA network enables our model to acquire better sentence encodings by modeling adjacent sentences in a dialogue as question-answer pairs. We evaluate our model on the benchmark EmotionX datasets provided by SocialNLP 2018, and it achieves state-of-the-art performance.
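To make the pipeline described above concrete, the following is a minimal PyTorch sketch of a BiLSTM-with-attention utterance encoder followed by a contextualization BiLSTM over the whole dialogue. All layer names, sizes, and the single-dialogue batching are illustrative assumptions rather than the authors' exact configuration, and the QA network is omitted because its details are not specified in the abstract.

```python
# Sketch of the two-level architecture the abstract outlines: a BiLSTM with
# additive attention pools each utterance into a vector, and a second
# ("contextualization") BiLSTM runs over the utterance vectors before a
# per-utterance classifier. Hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class UtteranceEncoder(nn.Module):
    """BiLSTM over word embeddings followed by additive attention pooling."""
    def __init__(self, vocab_size, emb_dim=300, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)  # scores each time step

    def forward(self, token_ids):                     # (batch, seq_len)
        h, _ = self.bilstm(self.embed(token_ids))     # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time steps
        return (weights * h).sum(dim=1)               # (batch, 2*hidden)

class DialogueEmotionModel(nn.Module):
    """Contextualizes utterance vectors across the dialogue, then classifies each."""
    def __init__(self, vocab_size, hidden=128, num_labels=4):
        super().__init__()
        self.encoder = UtteranceEncoder(vocab_size, hidden=hidden)
        self.context = nn.LSTM(2 * hidden, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, num_labels)  # joy/sadness/anger/neutral

    def forward(self, dialogue):                      # (num_utts, seq_len)
        utt_vecs = self.encoder(dialogue)             # (num_utts, 2*hidden)
        ctx, _ = self.context(utt_vecs.unsqueeze(0))  # (1, num_utts, 2*hidden)
        return self.classifier(ctx.squeeze(0))        # (num_utts, num_labels)

# Toy usage: a 3-utterance dialogue of padded token ids, one logit row per utterance.
model = DialogueEmotionModel(vocab_size=1000)
dialogue = torch.randint(1, 1000, (3, 12))
logits = model(dialogue)
```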

Keywords

Emotion detection · Multi-party dialogue · Neural model

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61772036, 61331011) and the Key Laboratory of Science, Technology and Standard in Press Industry (Key Laboratory of Intelligent Press Media Technology). We thank the anonymous reviewers for their helpful comments.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Institute of Computer Science and Technology, Peking University, Beijing, China
  2. The MOE Key Laboratory of Computational Linguistics, Peking University, Beijing, China
