Real-Time Emotion Recognition Through Video Conference and Streaming

  • Conference paper
Applications and Usability of Interactive TV (jAUTI 2021)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1597)


Abstract

The Covid-19 pandemic changed the course of both work and educational activities worldwide, forcing a migration to virtual platforms and videoconferencing tools such as Zoom, Google Meet, and Jitsi Meet, among others. This created a globalized, digital culture of learning, conference activities, and even business meetings held over videoconference. The new scenario creates uncertainty, especially for educators, about the level of attention they are receiving from students in virtual classes, and more generally in any setting where one wants to evaluate the emotions of the people receiving information virtually. For this reason, a tool is presented that captures video in real time from different videoconferencing platforms or other audiovisual media and automatically recognizes the emotions expressed by people using deep learning: happiness, sadness, surprise, anger, fear, disgust, and neutral. The initial training and validation of the system is based on the CK+ dataset, which contains images distributed by emotion. The tool was developed as a web application in Python with Flask and, in addition to real-time automatic recognition, generates statistics of the emotions of the people evaluated, with 75% accuracy. To validate the tool, the emotions of a group of students were evaluated through videoconferencing programs, as well as in open videos available online on YouTube. With this study, it was possible to recognize the emotions of the people who attended the class, which allows the teacher to take measures if the students do not carry out the planned activities.
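As an illustration of the kind of pipeline the abstract describes, the following is a minimal sketch, not the authors' implementation. It assumes OpenCV's Haar-cascade (Viola-Jones) face detector, a Keras CNN in the spirit of mini_Xception trained on 48x48 grayscale faces for the seven classes (loaded here from a hypothetical file emotion_model.h5), and Flask to serve the annotated video and the running emotion statistics.

import cv2
import numpy as np
from collections import Counter
from flask import Flask, Response
from tensorflow.keras.models import load_model

# The seven emotion classes mentioned in the abstract.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

app = Flask(__name__)
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
emotion_model = load_model("emotion_model.h5")  # hypothetical pre-trained CNN
emotion_counts = Counter()                      # running statistics per emotion

def annotate(frame):
    """Detect faces, classify each one, and draw the predicted emotion."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
        probs = emotion_model.predict(face[np.newaxis, ..., np.newaxis], verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        emotion_counts[label] += 1
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame

def frames():
    """Yield annotated frames as an MJPEG stream."""
    capture = cv2.VideoCapture(0)  # webcam here; other video sources are possible
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        _, jpeg = cv2.imencode(".jpg", annotate(frame))
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")

@app.route("/video")
def video():
    return Response(frames(), mimetype="multipart/x-mixed-replace; boundary=frame")

@app.route("/stats")
def stats():
    # Percentage of frames per detected emotion, as a simple statistics view.
    total = sum(emotion_counts.values()) or 1
    return {e: round(100 * c / total, 1) for e, c in emotion_counts.items()}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

Pointing cv2.VideoCapture at a screen-capture source or a stream URL instead of camera index 0 is one way such a loop could be aimed at a videoconference window or an open YouTube video, as in the validation described above.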


References

  1. Bundele, M., Banerjee, R.: Detection of fatigue of vehicular driver using skin conductance and oximetry pulse: a neural network approach. In: iiWAS 2009 - The 11th International Conference on Information Integration and Web-based Applications and Services, pp. 739–744 (2009). https://doi.org/10.1145/1806338.1806478

  2. Li, C., Xu, C., Feng, Z.: Analysis of physiological for emotion recognition with IRS model. Neurocomputing 178, 103–111 (2015). https://doi.org/10.1016/j.neucom.2015.07.112

  3. Londoño-Osorio, V., Marín-Pineda, J., Arango-Zuluaga, E.I.: Introducción a la Visión Artificial mediante Prácticas de Laboratorio Diseñadas en Matlab. TecnoLógicas, pp. 591–603 (2013). https://doi.org/10.22430/22565337.350

  4. Filipović, F., Baljak, L., Naumović, T., Labus, A., Bogdanović, Z.: Developing a web application for recognizing emotions in neuromarketing. In: Rocha, Á., Reis, J.L., Peter, M.K., Bogdanović, Z. (eds.) Marketing and Smart Technologies. SIST, vol. 167, pp. 297–308. Springer, Singapore (2020). https://doi.org/10.1007/978-981-15-1564-4_28

  5. Rawnaque, F.S., et al.: Technological advancements and opportunities in neuromarketing: a systematic review. Brain Inform. 7(1), 1–19 (2020). https://doi.org/10.1186/s40708-020-00109-x

  6. Chirra, V.R.R., Uyyala, S.R., Kolli, V.K.K.: Virtual facial expression recognition using deep CNN with ensemble learning. J. Ambient. Intell. Humaniz. Comput. 12(12), 10581–10599 (2021). https://doi.org/10.1007/s12652-020-02866-3

  7. Mateus, J.-C., Andrada, P., González-Cabrera, C., Ugalde, C., Novomisky, S.: Teachers’ perspectives for a critical agenda in media education post COVID-19. A comparative study in Latin America. Comunicar 30(70), 9–19 (2022). https://doi.org/10.3916/C70-2022-01

  8. Vivanco-Saraguro, A.: Teleducación en tiempos de COVID-19: brechas de desigualdad. CienciAmérica 9, 166–175 (2020). https://doi.org/10.33210/ca.v9i2.307

  9. Rebollo-Catalan, A., Pérez, R., Sánchez, R.B., Buzón-García, O., Caro, L.: Las emociones en el aprendizaje online. Relieve: Revista Electrónica de Investigación y Evaluación Educativa 14, 1–23 (2014). https://doi.org/10.7203/relieve.14.1.4201, ISSN 1134-4032

  10. El Hammoumi, O., Benmarrakchi, F., Ouherrou, N., El Kafi, J., El Hore, A.: Emotion recognition in e-learning systems. In: 2018 6th International Conference on Multimedia Computing and Systems (ICMCS), pp. 1–6 (2018). https://doi.org/10.1109/ICMCS.2018.8525872

  11. Mega, C., Ronconi, L., De Beni, R.: What makes a good student? How emotions, self-regulated learning, and motivation contribute to academic achievement. J. Educ. Psychol. 106(1), 121–131 (2014). https://doi.org/10.1037/a0033546

  12. Goetz, T., Frenzel, A., Pekrun, R., Hall, N.: Emotional intelligence in the context of learning and achievement. In: Schulze, R., Roberts, R.D. (eds.) Emotional Intelligence: An International Handbook, pp. 233–253. Hogrefe & Huber Publishers, Cambridge (2005). ISBN 0-88937-283-7

  13. Krithika, L., Lakshmi, G.G.: Student emotion recognition system (SERS) for e-learning improvement based on learner concentration metric. Procedia Comput. Sci. 85, 767–776 (2016). https://doi.org/10.1016/j.procs.2016.05.264

  14. Sharma, A., Mansotra, V.: Deep learning based student emotion recognition from facial expressions in classrooms. Int. J. Eng. Adv. Technol. 8(6), 4691–4699 (2019). https://doi.org/10.35940/ijeat.F9170.088619

  15. Darabian, H., et al.: Detecting cryptomining malware: a deep learning approach for static and dynamic analysis. J. Grid Comput. 18(2), 293–303 (2020). https://doi.org/10.1007/s10723-020-09510-6

  16. Jain, A., Sah, H.: Student’s feedback by emotion and speech recognition through deep learning. In: Proceedings - IEEE 2021 International Conference on Computing, Communication, and Intelligent Systems, ICCCIS 2021, pp. 442–447 (2021). https://doi.org/10.1109/ICCCIS51004.2021.9397145

  17. Welcome to Flask—Flask Documentation (2.1.x). https://flask.palletsprojects.com/en/2.1.x/. Accessed 25 Apr 2022

  18. Al-Tuwaijari, J., Shaker, S.: Face detection system based Viola-Jones algorithm. In: Proceedings of the 6th International Engineering Conference “Sustainable Technology and Development”, IEC 2020, pp. 211–215 (2020). https://doi.org/10.1109/IEC49899.2020.9122927

  19. Shah, R.: Face mask detection using convolution neural network. arXiv preprint arXiv:2106.05728 (2021). https://doi.org/10.48550/arXiv.2106.05728

  20. Arriaga, O., Plöger, P.G., Valdenegro, M.: Real-time convolutional neural networks for emotion and gender classification. In: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, pp. 221–226 (2019)

  21. Gogate, U., Parate, A., Sah, S., Narayanan, S.: Real time emotion recognition and gender classification. In: Proceedings of the 2020 International Conference on Smart Innovations in Design, Environment, Management, Planning and Computing, ICSIDEMPC 2020, pp. 138–143 (2020). https://doi.org/10.1109/ICSIDEMPC49020.2020.9299633

  22. Goodfellow, I.J., et al.: Challenges in representation learning: a report on three machine learning contests. In: Lee, M., Hirose, A., Hou, Z.-G., Kil, R.M. (eds.) ICONIP 2013. LNCS, vol. 8228, pp. 117–124. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-42051-1_16

  23. Sun, L., Ge, C., Zhong, Y.: Design and implementation of face emotion recognition system based on CNN Mini_Xception frameworks. J. Phys. Conf. Ser. 2010, 012123 (2021). https://doi.org/10.1088/1742-6596/2010/1/012123

  24. Lucey, P., Cohn, J.F., Kanade, T., Saragih, J., Ambadar, Z., Matthews, I.: The extended Cohn-Kanade dataset (CK+): a complete dataset for action unit and emotion-specified expression. In: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010, pp. 94–101 (2010). https://doi.org/10.1109/CVPRW.2010.5543262

Author information

Correspondence to Nancy Paredes.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Paredes, N., Caicedo Bravo, E., Bacca, B. (2022). Real-Time Emotion Recognition Through Video Conference and Streaming. In: Abásolo, M.J., Olmedo Cifuentes, G.F. (eds) Applications and Usability of Interactive TV. jAUTI 2021. Communications in Computer and Information Science, vol 1597. Springer, Cham. https://doi.org/10.1007/978-3-031-22210-8_3

  • DOI: https://doi.org/10.1007/978-3-031-22210-8_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-22209-2

  • Online ISBN: 978-3-031-22210-8

  • eBook Packages: Computer Science, Computer Science (R0)
