
BAB-QA: A New Neural Model for Emotion Detection in Multi-party Dialogue

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11439)


Abstract

In this paper, we propose a new neural model, BAB-QA, for the task of emotion detection in multi-party dialogue, which aims to label each utterance in a dialogue with one of four emotions: joy, sadness, anger, or neutral. A variety of models have been proposed for this task, but few of them capture the contextual information in a dialogue properly. We therefore adopt a bi-directional Long Short-Term Memory network (BiLSTM) and an attention network to obtain sentence representations, and then apply a contextualization network to refine these representations for classification. More importantly, we propose and incorporate a new module, the QA network, inspired by natural language inference tasks. The QA network enables our model to acquire better sentence encodings by modeling adjacent sentences in a dialogue as question-answer pairs. We evaluate our model on the benchmark EmotionX dataset provided by SocialNLP 2018, and it achieves state-of-the-art performance.
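
The abstract describes the architecture only in prose. The following PyTorch sketch is a minimal, hypothetical re-creation of that pipeline, not the authors' released code: a BiLSTM with attention pooling encodes each utterance, a dialogue-level BiLSTM contextualizes the utterance vectors, and a QA-style match pairs each utterance with its predecessor before classification into the four emotion labels. All module names, dimensions, and the bilinear pairing are assumptions.

```python
# Illustrative sketch only: an assumed re-creation of the pipeline the abstract
# describes (utterance BiLSTM + attention, dialogue-level contextualization,
# QA-style pairing of adjacent utterances). Not the authors' implementation.
import torch
import torch.nn as nn


class UtteranceEncoder(nn.Module):
    """BiLSTM over word embeddings followed by attention pooling."""

    def __init__(self, emb_dim=300, hidden=128):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)

    def forward(self, words):                            # words: (num_tokens, emb_dim)
        h, _ = self.bilstm(words.unsqueeze(0))            # (1, num_tokens, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)      # attention over tokens
        return (weights * h).sum(dim=1).squeeze(0)        # (2*hidden,)


class BABQASketch(nn.Module):
    """Encode utterances, contextualize over the dialogue, and score adjacent
    utterances as question-answer pairs (hypothetical combination)."""

    def __init__(self, emb_dim=300, hidden=128, num_labels=4):
        super().__init__()
        self.utt_enc = UtteranceEncoder(emb_dim, hidden)
        # Dialogue-level BiLSTM refines each utterance vector with its context.
        self.context = nn.LSTM(2 * hidden, hidden, batch_first=True, bidirectional=True)
        # QA network (assumed form): bilinear match between an utterance
        # ("answer") and the previous utterance ("question").
        self.qa_match = nn.Bilinear(2 * hidden, 2 * hidden, 2 * hidden)
        self.classifier = nn.Linear(4 * hidden, num_labels)  # joy/sadness/anger/neutral

    def forward(self, dialogue):                  # dialogue: list of (num_tokens, emb_dim)
        utt = torch.stack([self.utt_enc(u) for u in dialogue])   # (T, 2*hidden)
        ctx, _ = self.context(utt.unsqueeze(0))
        ctx = ctx.squeeze(0)                                      # (T, 2*hidden)
        prev = torch.cat([ctx[:1], ctx[:-1]], dim=0)              # predecessor of each utterance
        qa = self.qa_match(prev, ctx)                             # (T, 2*hidden)
        return self.classifier(torch.cat([ctx, qa], dim=-1))      # (T, num_labels)
```

Given pre-trained word embeddings (e.g., GloVe), such a model maps a dialogue of T utterances to a (T, 4) tensor of label scores; training details such as dropout and handling of class imbalance are omitted here and would follow the paper.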


Notes

  1. SocialNLP2018 Workshop Challenge: http://doraemon.iis.sinica.edu.tw/emotionlines/challenge.html.


Acknowledgments

This work was supported by the National Natural Science Foundation of China (61772036, 61331011) and the Key Laboratory of Science, Technology and Standard in Press Industry (Key Laboratory of Intelligent Press Media Technology). We thank the anonymous reviewers for their helpful comments.

Author information

Corresponding author

Correspondence to Xiaojun Wan.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Wang, Z., Wan, Z., Wan, X. (2019). BAB-QA: A New Neural Model for Emotion Detection in Multi-party Dialogue. In: Yang, Q., Zhou, ZH., Gong, Z., Zhang, ML., Huang, SJ. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2019. Lecture Notes in Computer Science, vol 11439. Springer, Cham. https://doi.org/10.1007/978-3-030-16148-4_17


  • DOI: https://doi.org/10.1007/978-3-030-16148-4_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-16147-7

  • Online ISBN: 978-3-030-16148-4

  • eBook Packages: Computer Science, Computer Science (R0)
