A Suggestion to Improve User-Friendliness Based on Monitoring Computer User’s Emotions

  • Keum Young Sung
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10289)


Despite great progress in computing devices, including notebook PCs, tablet computers, and smartphones, users' emotions are still given no consideration while they use these devices. This study proposes a preliminary idea and technique for monitoring changes in a user's emotional state during computer use. Using physiological signals collected from sensors attached to a mouse or keyboard, application programs or the operating system can be made to respond with appropriate actions depending on the user's emotional state. The two body signals used in this study are finger temperature and skin resistance, both of which can be measured through sensors attached to a keyboard and a mouse. To respond to a user's emotions, an application program must include components that detect abrupt emotional changes from the sensor input. To make full use of affective techniques in application software and the operating system, the programs must incorporate functionality for processing body signals, and the input devices must be fitted with sensors for measuring the user's temperature and skin resistance. The main difficulties of an affective UI are deriving a general indication of emotion that holds across many users, and extracting particular emotions from the body temperature and skin resistance measured at the input devices, a problem tied to the individual variability of body signals.
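The detection component described above (flagging abrupt changes in a signal such as skin resistance or finger temperature against a user's own baseline, which also addresses the individualization problem) could be sketched as follows. This is a minimal illustration, not the paper's implementation; the class name, window size, and z-score threshold are all assumptions.

```python
from collections import deque
from statistics import mean, stdev

class EmotionChangeDetector:
    """Flags abrupt shifts in one physiological signal (e.g. skin
    resistance or finger temperature) relative to a rolling baseline
    built from that same user's recent samples."""

    def __init__(self, window=20, threshold=3.0):
        self.baseline = deque(maxlen=window)  # recent samples for this user
        self.threshold = threshold            # z-score counted as "abrupt"

    def update(self, sample):
        """Feed one sensor sample; return True if it deviates sharply
        from the user's recent baseline."""
        if len(self.baseline) >= 5:  # wait until the baseline is meaningful
            mu, sigma = mean(self.baseline), stdev(self.baseline)
            if sigma > 0 and abs(sample - mu) / sigma > self.threshold:
                self.baseline.append(sample)
                return True
        self.baseline.append(sample)
        return False

# Example: a steady skin-resistance reading (in arbitrary units),
# then a sudden jump that the detector reports as an abrupt change.
det = EmotionChangeDetector()
calm = [500, 501, 499, 500, 502, 498, 500, 501, 499, 500]
events = [det.update(s) for s in calm]
spike = det.update(550)
```

Because the baseline is per-user, the same absolute reading can be "calm" for one user and "abrupt" for another, which is one simple way to handle the individual variability of body signals noted above.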


Keywords: Affective computing · Emotion communication · Skin resistance · Skin temperature



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. School of Computer Science and Electronic Engineering, Handong Global University, Pohang, Republic of Korea
