
Correlation between selected gait variables and emotion using virtual reality

  • Young Kim
  • JunYoung Moon
  • Nak-Jun Sung
  • Min Hong (corresponding author)
Original Research

Abstract

Gait patterns and their characteristics can change with a person's emotional state. We implemented a data collection system that analyzes the relationship between gait and emotional state using a virtual reality (VR) environment and a mat-type pressure sensor equipped with 1008 sensors. Twelve healthy young adults (6 F, 6 M) in their 20s participated in this study and watched, in random order, 3 different types of videos containing calming, sad, and joyful scenes, presented in 3D 360° format. The subjects used the modified Differential Emotions Scale (m-DES) to self-report their current emotion before and after watching each set of videos, and their real-time gait patterns captured by the mat sensor were analyzed. For gait pattern analysis, step count per minute, gait speed, plantar pressure distribution (p1–p8), and peak plantar pressure were compared between conditions. The results showed that both step count and gait speed increased significantly in the joyful state compared with the calm and sad emotional states, and decreased most in the sad state. A high correlation was found between joyful emotion and faster gait speed, as well as higher plantar pressure in the forefoot. The plantar pressure distribution and its peaks were weighted toward the 1st and 2nd metatarsal area in the joyful state and concentrated in the heel area in the sad state. Gait patterns in the calm emotional state were not significantly differentiated. VR can be a useful tool for self-managing negative emotions and for helping depressive and cheerless individuals increase their physical activity.
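
To make the reported measures concrete, the following Python sketch shows one way such metrics could be computed from a stream of pressure-mat frames. It is our illustration, not the authors' implementation: the 24 × 42 grid layout, 100 Hz sampling rate, contact threshold, and p1–p8 region masks are all assumptions, since the abstract does not specify them.

```python
import numpy as np

# Assumed mat geometry: the 1008 cells arranged as a 24 x 42 grid, sampled at
# 100 Hz. The paper does not specify the layout or sampling rate; these values
# are illustrative only.
ROWS, COLS, FS = 24, 42, 100  # 24 * 42 = 1008 cells


def count_steps(frames, contact_thresh=5.0):
    """Count foot contacts as rising edges of the total mat load.

    frames: float array of shape (n_frames, ROWS, COLS), per-cell pressure.
    contact_thresh: total-load threshold separating swing from stance
                    (an assumed calibration value).
    """
    total = frames.reshape(len(frames), -1).sum(axis=1)
    loaded = total > contact_thresh
    # A step begins wherever the mat goes from unloaded to loaded.
    return int(np.sum(loaded[1:] & ~loaded[:-1]))


def steps_per_minute(frames):
    """Step count normalized to a one-minute rate."""
    duration_min = len(frames) / FS / 60.0
    return count_steps(frames) / duration_min


def regional_distribution(footprint, region_masks):
    """Relative load carried by each plantar region (p1-p8).

    footprint: (ROWS, COLS) pressure image accumulated over one stance phase.
    region_masks: dict mapping 'p1'..'p8' to boolean (ROWS, COLS) masks,
                  e.g. p1/p2 over the 1st/2nd metatarsal heads and p8 over
                  the heel (the paper's exact region definitions are not given).
    """
    total = footprint.sum()
    total = total if total > 0 else 1.0  # avoid division by zero on empty frames
    return {name: footprint[mask].sum() / total
            for name, mask in region_masks.items()}


def peak_plantar_pressure(footprint):
    """Peak plantar pressure: the maximum single-cell reading."""
    return float(footprint.max())
```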

Keywords

Wellness technology · Gait analysis · Emotional state · Virtual reality

Acknowledgements

This research was supported by the Bio & Medical Technology Development Program of the National Research Foundation (NRF), funded by the Ministry of Science, ICT & Future Planning (NRF-2015M3A9D7067388), and by the Soonchunhyang University Research Fund.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  • Young Kim (1)
  • JunYoung Moon (2)
  • Nak-Jun Sung (2)
  • Min Hong (3) (corresponding author)

  1. Wellness Coaching Service Research Center, Soonchunhyang University, Asan, Republic of Korea
  2. Department of Computer Science, Soonchunhyang University, Asan, Republic of Korea
  3. Department of Computer Software Engineering, Soonchunhyang University, Asan, Republic of Korea
