Virtual Environment for Monitoring Emotional Behaviour in Driving

  • Claude Frasson
  • Pierre Olivier Brosseau
  • Thi Hong Dung Tran
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8474)

Abstract

Emotions are an important aspect of human behaviour and frequently arise in driving situations. Uncontrolled emotions can lead to harmful effects. To control and reduce the negative impact of emotions, we have built a virtual driving environment in which we capture and analyse the emotions felt by the driver using an EEG system. By simulating specific emotional situations we can provoke these emotions and detect their type and intensity for each driver. The environment then generates corrective actions that reduce these emotions. After a training period, the driver is able to correct the emotions on his own.
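The abstract describes a closed loop: an EEG-based classifier produces an emotion type and intensity, and the driving environment reacts with a corrective action. The short Python sketch below illustrates that loop under stated assumptions only; the emotion labels, the 0.6 intensity threshold, and the intervention names are hypothetical and are not taken from the paper.

    # Hypothetical sketch of the closed loop described in the abstract:
    # an EEG classifier yields an emotion type and intensity, and the
    # driving environment responds with a corrective action.
    # Labels, threshold, and action names are illustrative assumptions.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EmotionEstimate:
        label: str        # e.g. "anger", "stress", "calm" (assumed label set)
        intensity: float  # normalized 0.0-1.0, as output by an EEG classifier

    def corrective_action(estimate: EmotionEstimate) -> Optional[str]:
        """Return a simulator intervention when a negative emotion is too intense."""
        interventions = {
            "anger": "play_calming_audio",
            "stress": "reduce_traffic_density",
            "fear": "increase_road_visibility",
        }
        if estimate.label in interventions and estimate.intensity > 0.6:
            return interventions[estimate.label]
        return None  # emotion is absent or mild: no intervention

    # Example: a high-intensity anger reading triggers a calming intervention.
    print(corrective_action(EmotionEstimate("anger", 0.8)))  # play_calming_audio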

Keywords

Emotions · Simulation · EEG · Driving · Emotional state

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Claude Frasson (1)
  • Pierre Olivier Brosseau (1)
  • Thi Hong Dung Tran (1)
  1. Département d’informatique et de recherche opérationnelle, Université de Montréal, Montréal, Canada
