
Using Non-invasive Wearables for Detecting Emotions with Intelligent Agents

  • Jaime Andres Rincon
  • Ângelo Costa
  • Paulo Novais
  • Vicente Julian
  • Carlos Carrascosa
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 527)

Abstract

This paper proposes the use of intelligent wristbands for the automatic detection of emotional states in order to develop an application that extracts, analyzes, represents and manages the social emotion of a group of entities. Detecting the joint emotion of a heterogeneous group of people is still an open issue; most existing approaches are centered on detecting and managing the emotion of a single entity. Concretely, the application tries to detect how music influences, positively or negatively, the emotional states of individuals. The main goal of the proposed system is to play music that encourages an increase in the overall happiness of the patrons.
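The sketch below is only an illustration of the idea described in the abstract: individual emotional states estimated from the wristbands are aggregated into a group-level "social emotion", and the next piece of music is chosen so as to move that aggregate towards a happier region. The valence-arousal representation, the mean-plus-dispersion aggregation, and the track annotations (delta_valence, delta_arousal) are assumptions made for the example, not details taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): aggregate individual
# emotional states, as estimated from wearable sensors, into a group-level
# "social emotion" and pick the next track to push the group towards happiness.
from dataclasses import dataclass
from statistics import mean, stdev
from typing import List


@dataclass
class EmotionalState:
    """Emotion of one person in a 2D valence-arousal space (values in [-1, 1])."""
    valence: float   # negative = unpleasant, positive = pleasant
    arousal: float   # negative = calm, positive = excited


def social_emotion(states: List[EmotionalState]) -> dict:
    """Summarise the group emotion as the mean of the individual states plus
    their dispersion (how emotionally 'spread out' the group is)."""
    v = [s.valence for s in states]
    a = [s.arousal for s in states]
    return {
        "centre": (mean(v), mean(a)),
        "dispersion": (stdev(v) if len(v) > 1 else 0.0,
                       stdev(a) if len(a) > 1 else 0.0),
    }


def choose_next_track(group: dict, playlist: List[dict]) -> dict:
    """Pick the track whose annotated expected emotional effect moves the
    group centre closest to a 'happy' target: high valence, moderate arousal."""
    target = (1.0, 0.4)  # hypothetical target point for "happiness"
    cv, ca = group["centre"]

    def distance_after(track: dict) -> float:
        # Assumes each track is annotated with an expected valence/arousal shift.
        nv, na = cv + track["delta_valence"], ca + track["delta_arousal"]
        return (nv - target[0]) ** 2 + (na - target[1]) ** 2

    return min(playlist, key=distance_after)


if __name__ == "__main__":
    # Readings that, in the real system, would come from the wristband sensors.
    group = [EmotionalState(-0.2, 0.1), EmotionalState(0.3, 0.5),
             EmotionalState(0.0, -0.4)]
    playlist = [
        {"title": "calm piano", "delta_valence": 0.1, "delta_arousal": -0.3},
        {"title": "upbeat pop", "delta_valence": 0.4, "delta_arousal": 0.3},
    ]
    summary = social_emotion(group)
    print("social emotion:", summary)
    print("next track:", choose_next_track(summary, playlist)["title"])
```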

Notes

Acknowledgements

This work is partially supported by the MINECO/FEDER project TIN2015-65515-C4-1-R and by the FPI grant AP2013-01276 awarded to Jaime-Andres Rincon. This work is also supported by COMPETE: POCI-01-0145-FEDER-007043 and by FCT – Fundação para a Ciência e Tecnologia within the project UID/CEC/00319/2013 and the Post-Doc scholarship SFRH/BPD/102696/2014 (A. Costa).


Copyright information

© Springer International Publishing AG 2017

Open Access. This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  • Jaime Andres Rincon (1) (corresponding author)
  • Ângelo Costa (2)
  • Paulo Novais (2)
  • Vicente Julian (1)
  • Carlos Carrascosa (1)

  1. D. Sistemas Informáticos y Computación, Universitat Politècnica de València, Valencia, Spain
  2. Centro ALGORITMI, Escola de Engenharia, Universidade do Minho, Guimarães, Portugal
