User Modelling: An Empirical Study for Affect Perception Through Keyboard and Speech in a Bi-modal User Interface
This paper presents and discusses an empirical study conducted among different kinds of computer users. The aim of the study was to find out how users react when they encounter emotion-generating situations while interacting with a computer. The study focused on two modes of human-computer interaction, namely keyboard and microphone input. The results were analyzed in terms of the characteristics of the participating users (age, educational level, computer knowledge, etc.) and were used to create a user modeling component that silently monitors users, records their actions in the two modes of interaction, and interprets those actions in terms of the users' feelings. This user modeling component can be incorporated into any application that provides adaptive interaction based on affect perception.
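The bi-modal monitoring component described above can be pictured as a small event recorder that collects keyboard and speech cues and maps them to an affect label. The sketch below is a minimal illustration under stated assumptions: the event names (`"backspace"`, `"raised_voice"`), the affect labels, and the toy inference rule are all hypothetical and are not the authors' actual model.

```python
from dataclasses import dataclass, field

@dataclass
class BiModalUserModel:
    # Events from the two interaction modes the paper considers:
    # keyboard input and microphone (speech) input.
    keyboard_events: list = field(default_factory=list)
    speech_events: list = field(default_factory=list)

    def record_keyboard(self, event: str) -> None:
        # Silently log a keyboard action, e.g. "type" or "backspace".
        self.keyboard_events.append(event)

    def record_speech(self, event: str) -> None:
        # Silently log a speech cue, e.g. "calm" or "raised_voice".
        self.speech_events.append(event)

    def infer_affect(self) -> str:
        # Toy rule (an assumption, not the paper's method): repeated
        # backspacing combined with a raised voice is read as frustration;
        # anything else defaults to a neutral state.
        backspaces = self.keyboard_events.count("backspace")
        if backspaces >= 3 and "raised_voice" in self.speech_events:
            return "frustrated"
        return "neutral"

model = BiModalUserModel()
for _ in range(3):
    model.record_keyboard("backspace")
model.record_speech("raised_voice")
print(model.infer_affect())  # frustrated under the toy rule above
```

An adaptive application would subscribe to the inferred label and adjust its interaction accordingly; the actual interpretation rules in the paper are derived from the empirical study's results rather than hand-written thresholds like these.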