Abstract
The COVID-19 pandemic changed the course of work and educational activities worldwide, forcing a migration to virtual platforms and videoconferencing tools such as Zoom, Google Meet, and Jitsi Meet. This shift produced a globalized, digital culture of learning, conference activities, and even business meetings conducted over videoconference. The new scenario creates uncertainty, especially among educators, about the level of attention students pay during virtual classes, and in other settings where one wants to assess the emotions of the people receiving the information. For this reason, a tool is presented that captures video in real time from videoconferencing platforms or other audiovisual media and automatically recognizes the emotions people express using deep learning: happiness, sadness, surprise, anger, fear, disgust, and neutral. Initial training and validation of the system are based on the CK+ dataset, which contains images organized by emotion. The tool was developed as a web application in Python with Flask and, in addition to real-time automatic recognition, generates statistics of the emotions of the people evaluated, with 75% accuracy. To validate the tool, videoconferencing programs were used, the emotions of a group of students were evaluated, and open videos available on YouTube were analyzed. With this study, it was possible to recognize the emotions of the people attending a class, which allows the teacher to take measures if the students do not carry out the planned activities.
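The abstract describes aggregating per-frame emotion predictions into statistics for the people evaluated. A minimal sketch of that aggregation step is shown below; the `emotion_statistics` helper and the seven-label list are illustrative assumptions, not the authors' actual implementation, and the per-frame labels would in practice come from the deep learning classifier described in the paper.

```python
from collections import Counter

# The seven emotion classes named in the abstract.
EMOTIONS = ["happiness", "sadness", "surprise", "anger", "fear", "disgust", "neutral"]

def emotion_statistics(frame_predictions):
    """Aggregate per-frame emotion labels into percentage statistics.

    frame_predictions: one predicted emotion label per analyzed video frame.
    Returns a dict mapping each emotion to its percentage of frames.
    """
    counts = Counter(frame_predictions)
    total = len(frame_predictions)
    return {e: round(100.0 * counts.get(e, 0) / total, 1) for e in EMOTIONS}

# Example: labels a classifier might emit over a short clip of a student.
labels = ["neutral"] * 6 + ["happiness"] * 3 + ["surprise"]
stats = emotion_statistics(labels)
print(stats["neutral"])  # 60.0
```

A summary like this is what would let a teacher see, for instance, that a student was predominantly neutral during a session and react accordingly.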
© 2022 Springer Nature Switzerland AG
Cite this paper
Paredes, N., Caicedo Bravo, E., Bacca, B. (2022). Real-Time Emotion Recognition Through Video Conference and Streaming. In: Abásolo, M.J., Olmedo Cifuentes, G.F. (eds) Applications and Usability of Interactive TV. jAUTI 2021. Communications in Computer and Information Science, vol 1597. Springer, Cham. https://doi.org/10.1007/978-3-031-22210-8_3
Print ISBN: 978-3-031-22209-2
Online ISBN: 978-3-031-22210-8
eBook Packages: Computer Science, Computer Science (R0)