Abstract
In this paper, we propose a new neural model, BAB-QA, for emotion detection in multi-party dialogues: the task of assigning each utterance in a dialogue one of four candidate labels: joy, sadness, anger, or neutral. A variety of models have been proposed for this task, but few capture the contextual information in a dialogue properly. We therefore adopt a Bi-directional Long Short-Term Memory network (BiLSTM) and an attention network to obtain sentence representations, and then apply a contextualization network to refine these representations for classification. More importantly, we propose and incorporate a new module, the QA network, inspired by natural language inference tasks. The QA network enables our model to acquire better sentence encodings by modeling adjacent sentences in a dialogue as question-answer pairs. We evaluate our model on the benchmark EmotionX dataset provided by SocialNLP2018, and it achieves state-of-the-art performance.
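The attention step described above, pooling a sequence of BiLSTM hidden states into a single sentence vector, can be sketched as follows. This is a minimal illustration in numpy, not the authors' implementation: the function name `attention_pool`, the scoring vector `w`, and all shapes are assumptions chosen for clarity.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Collapse BiLSTM hidden states of shape (T, d) into one
    sentence vector of shape (d,) via a learned attention weighting."""
    scores = hidden_states @ w           # one scalar score per token, shape (T,)
    alpha = softmax(scores)              # attention distribution over tokens
    return alpha @ hidden_states, alpha  # weighted sum of hidden states

# Toy example: a 5-token utterance with 8-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # stand-in for BiLSTM outputs
w = rng.standard_normal(8)        # stand-in for the learned attention vector
sent_vec, alpha = attention_pool(H, w)
```

In the full model, the resulting sentence vectors would then be fed to the contextualization network, which refines each utterance's representation using its neighbors in the dialogue.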
Notes
1. SocialNLP2018 Workshop Challenge: http://doraemon.iis.sinica.edu.tw/emotionlines/challenge.html.
Acknowledgments
This work was supported by the National Natural Science Foundation of China (61772036, 61331011) and the Key Laboratory of Science, Technology and Standard in Press Industry (Key Laboratory of Intelligent Press Media Technology). We thank the anonymous reviewers for their helpful comments.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Wang, Z., Wan, Z., Wan, X. (2019). BAB-QA: A New Neural Model for Emotion Detection in Multi-party Dialogue. In: Yang, Q., Zhou, ZH., Gong, Z., Zhang, ML., Huang, SJ. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2019. Lecture Notes in Computer Science(), vol 11439. Springer, Cham. https://doi.org/10.1007/978-3-030-16148-4_17
Print ISBN: 978-3-030-16147-7
Online ISBN: 978-3-030-16148-4