
Development of Mixed Reality Systems to Support Therapies

  • Bruno Patrão
  • Paulo Menezes
  • Paula Castilho
Conference paper
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 470)

Abstract

This work focuses on the development of Mixed Reality-based systems suitable for use in psychological therapies. Recent research on Immersive Virtual Environments has shown their applicability in psychological domains. Firstly, we intend to explore bio-signal data for emotion recognition. Secondly, we aim to develop an innovative system that allows the creation and manipulation of experiments oriented to psychological therapy. Finally, these experiments will be tested and evaluated in real scenarios by therapists with clinical patients. The results of this work will enable the assessment of the efficacy of these experiments and their improvement, so that they can be applied in a generalised way to individuals with emotional, behavioural and psychological problems.

Keywords

Affective computing · Sensing technologies · Embodiment · Mixed reality · Computer graphics

1 Introduction

Virtual reality has established itself as a powerful tool for the treatment of panic disorder and anxiety. Among these disorders, virtual reality has been used, for example, in the treatment of fear of flying, driving, heights, public speaking and storms; claustrophobia; agoraphobia; arachnophobia; social phobia; panic disorder; and post-traumatic stress due to traffic accidents.

Research carried out so far using virtual reality for exposure therapy has demonstrated the potential of this technology for the treatment of various types of phobias. The most common and effective treatment of certain phobias is gradual exposure, i.e., the patient is exposed to conditions that gradually stimulate anxiety. Initially the anxiety increases, but as the treatment progresses it tends to decrease. Another form of treatment is through imagery: recalling the feared situations and describing the experience to the therapist [1].

Blascovich et al. enumerated the methodological advantages of virtual-reality-based studies for social psychological research: increased safety and control; treatment that is more efficient and easier to schedule; easier and more effective understanding by therapists of patients' concerns, streamlining the diagnostic and treatment processes; an unlimited number of repetitions of feared situations; and better therapist control over the environment to which the patient is exposed [2]. Virtual reality can be used to induce affect in the treatment of anxiety, particularly for exposure therapy. A meta-analysis showed effects comparable to clinical in vivo exposure [3].

For virtual reality to be effective in treatment, the virtual environments must be able to cause anxiety in patients, i.e., patients need the sense of actually experiencing the situation. In virtual reality, this is associated with the concepts of presence and immersion. The feeling of presence refers to the sense of involvement and excitement with the environment and the objects in it; when experiencing virtual environments, it is a key concept, translated as the feeling of being present in another location or space [7]. Immersion, in turn, is a psychological state characterised by a sense of involvement in the environment. This perception is created through images, sounds, and other stimuli that allow interaction with an environment providing a series of stimuli and experiences to the participant. The sense of presence is vital in any experience of this kind, as it determines whether or not the participant will perceive the virtual world as real. One way to achieve it is through multisensory stimulation: by increasing the number of senses stimulated in virtual reality, it is possible to dramatically improve a user's feeling of presence and, consequently, his/her satisfaction and memories of the experiment.

1.1 Motivation

The general aim of this work is to develop a system capable of inducing, identifying and classifying human emotions and physiological activation. We intend to explore two complementary parts: on one side, the induction of basic emotions related to threat cues (e.g. self-disgust, anxiety), which are consistently associated with psychopathology; on the other, the evaluation of physiological reactions, reflexive behaviours and the identification of stimuli linked to emotion recognition. Thus, it will be possible to have a closed-loop system capable of controlling the emotional state of the patient during a therapeutic process.

The first part will be based on the analysis of body motion (e.g. body language, eye tracking, facial expression) and measured bio-signals (e.g., heart rate, body temperature, skin conductance, respiratory rate). The second part will drive the development of a framework specially designed for the intended goals of therapeutic procedures, in which visual, auditory or haptic stimuli will be combined.
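The closed loop described above can be sketched in a few lines. The following is a purely illustrative sketch, not the authors' implementation: the function names, the arousal score and all thresholds are assumptions introduced only to show how measured signals could feed back into the stimulus intensity.

```python
# Hypothetical sketch of a closed emotional-control loop: classify the
# patient's arousal from normalised physiological signals, then nudge the
# stimulus intensity towards a target arousal level. All names, weights
# and thresholds are illustrative assumptions.

def classify_arousal(heart_rate, skin_conductance):
    """Toy arousal score in [0, 1] from two signals normalised to [0, 1]."""
    return min(1.0, 0.5 * heart_rate + 0.5 * skin_conductance)

def adjust_stimulus(intensity, arousal, target=0.6, gain=0.1):
    """Move the stimulus intensity towards the target arousal level."""
    intensity += gain * (target - arousal)
    return max(0.0, min(1.0, intensity))

# One loop iteration: an over-aroused patient gets a gentler stimulus.
intensity = adjust_stimulus(0.5, arousal=0.9)
assert intensity < 0.5
```

In a real system the classifier would be trained on labelled physiological data rather than being a fixed weighted sum, but the control structure (measure, classify, adjust) is the same.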

Finally, the developed solution will be tested and evaluated in real therapeutic scenarios, under the supervision of specialists and following the ethical and legal procedures. This work will enable the assessment of the efficacy of these technologies and how they can be improved and applied in a generalised way to individuals with psychological and emotional problems.

2 Cyber-Physical Systems

The recent availability of low-cost immersive devices, the growing computing power of portable devices, and cloud services create excellent opportunities for the application of virtual reality systems in therapeutic contexts, one of the areas that can genuinely benefit from these systems.

The proposed tool is intended for use in therapeutic contexts, but is not limited to them. Taking advantage of today's fast Internet connections, it can be used anywhere, and the therapist can always remotely design or control the experience on the fly. There are two main advantages to using this kind of immersive technology. First, the therapist can manipulate the entire environment and guide the patient through a specific situation without physically interfering, which applies to most exposure-based therapies. Second, the therapist can observe the patient's physiological responses in real time and act accordingly. Furthermore, the therapist can have a virtual representation in the same virtual environment as the patient.

3 State of the Art

In a virtual reality environment, participants are exposed to digital content representing real-world scenarios, people, objects, and events which, once combined with full tracking of the user's body, enables immersion and natural interaction with artificial worlds. Virtual reality allows the participant to explore these environments immersively, through first-person perspectives, as opposed to viewing the scene from stationary positions or third-person views, which generates different impacts on the user experience [4, 5]. Virtual reality has great potential as a method for the induction of affects and emotions; nevertheless, it has rarely been used for this purpose until now.

Rizzo et al. proposed virtual warfare scenarios for therapeutic use with soldiers suffering from post-war trauma [6]. Also relevant are the studies conducted by Mel Slater et al., who developed virtual environments for the study of neurological rehabilitation and other psychological domains [8].

Another notable work is project EMMA (Engaging Media for Mental Health Applications), which explores the contribution of emotions to "presence", the feeling of being immersed in a virtual environment. In this project, a virtual urban park combined with multisensory stimuli (e.g., sounds, sights, lighting effects and other forms of affective stimuli) was used to induce emotional changes such as anxiety or relaxation [9].

The success of these immersive systems depends on the level of "sense of presence" that they can induce in a participant taking part in such experiences. The sense of presence refers to the sense of "being there", in the virtual environments created by the technology. Botvinick and Cohen, in their work known as the "rubber hand illusion" [10], found that a fake body part can be incorporated into the human body representation through synchronous multisensory stimulation of the fake and the corresponding real body part.

For instance, people experiencing virtual reality may have the illusion of being in a virtual place and, consequently, carry out actions as if the situation and events depicted were really happening. Accompanying these actions, we may observe physiological changes (e.g., in heart rate, body temperature, skin conductance, and respiratory rate), reflexive behaviours (e.g., eye blinking or smiling at a virtual human character) and emotional arousal. As far as we know, few authors have combined virtual reality with physiological data.

4 Research Contributions and Innovation

The expected contributions of the current work are the analysis of body motion and bio-signal data for emotion recognition, and the development of an innovative framework that allows the creation and manipulation of experiments oriented to psychological therapy. To this end, we have created different setups.

Figure 1 represents the diagram of the proposed system; the dashed boxes represent on-going work. The therapist can manipulate some parameters of the Stimuli System, such as scene type and stimulus intensity (input data), and receive the patient's biofeedback (output data).
Fig. 1.

Proposed system diagram.

The first prototype was designed to elicit fear and avoidance reactions. It consists of a virtual mirror room in which, at a certain point, the participant is instructed to touch the mirror, triggering a set of grim visual and auditory events that culminate in the severance of the avatar's hand. Building this prototype required several steps to produce the components that, once integrated, result in the intended setup. These steps are described in the next subsections.

4.1 Model Acquisition System

First, we reconstruct the user's 3D model so that he/she can have a virtual representation (Fig. 2). The 3D reconstruction solves two problems at once. On one hand, the user's height is used to place the point of view in the virtual world, helping spatial awareness, because environments are modelled at a 1:1 scale; on the other hand, the user accepts his/her virtual representation faster and, consequently, experiences a better feeling of immersion.
Fig. 2.

User 3D reconstruction.

4.2 Bio-signals System

It is well known that physiological signals change in response to physical activity or emotional changes. There are morphological and physiological connections between these emotional states and the autonomic nervous system, including its sympathetic and parasympathetic divisions. Sympathetic nervous activity (SNA) and parasympathetic nervous activity (PNA) contribute to responses such as heart rate variability, skin conductance variability and body temperature changes. While the PNA controls the body's response at rest, the SNA is responsible for the internal response in the body's fight-or-flight state, i.e., the reaction that occurs in response to a perceived harmful event, attack, or threat to survival. It is very difficult to identify a direct correlation between physiological data and a particular emotion, but it is possible to detect the arousal of an emotion when a specific contextual stimulus is presented, e.g., fear can be detected when a horror situation is presented.

The following statements describe how we will use the physiological data in our system.

Electrocardiography. An essential tool to assess Heart Rate Variability (HRV), associated with changes in Heart Rate (HR) that are strongly related to stimulus activation.

Electro-Dermal Activity. Measures electrical skin conductance, which varies with the moisture level of the sweat glands. These are controlled by the sympathetic system, which affects EDA when it is active, especially when an individual is anxious or stressed.

Body Temperature. Varies depending on where it is measured. In our system, body temperature is measured on the skin of the upper torso.

Body Acceleration. When attached to a moving body, accelerometers can accurately sense its movements. We use them on several body parts (arms, torso, head) to detect psychomotor agitation.

Breathing Activity. Breathing involves the movement of the diaphragm, which is expressed in the expansion and compression of the rib cage and abdominal area. This produces patterns that the system uses to identify events.

To detect emotional arousal, the signals are acquired from a set of sensors and pre-processed to remove noise and to produce the features needed by the classification process.
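The acquisition pipeline (acquire, de-noise, extract features) can be illustrated with a minimal sketch. This is not the authors' implementation: the moving-average filter and the SDNN-style variability measure are standard, generic techniques used here only as an example of what "features for the classifier" might look like.

```python
# Illustrative sketch of the bio-signal pipeline: smooth a raw signal to
# suppress noise, then extract simple HRV features for the classifier.
# The specific filter and features are assumptions, not the paper's method.

def moving_average(signal, window=3):
    """Simple noise suppression: sliding-window mean over the raw samples."""
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

def hrv_features(rr_intervals_ms):
    """Mean R-R interval and a crude SDNN-like variability measure."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    var = sum((x - mean_rr) ** 2 for x in rr_intervals_ms) / len(rr_intervals_ms)
    return mean_rr, var ** 0.5

rr = [800, 810, 790, 805, 795]      # toy R-R intervals in milliseconds
mean_rr, sdnn = hrv_features(rr)    # features fed to the classifier
assert mean_rr == 800.0
```

A real pipeline would use proper band-pass filtering and validated HRV metrics, but the shape of the computation is the same.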

4.3 Stimuli System

Most development of immersive systems focuses on vision, since it is the dominant sense; thus, head-mounted displays (HMDs) are considered the core technology for this purpose. Consequently, the essential ingredient of virtual reality is a tracked HMD that lets the user see new views of the virtual world as he/she moves his/her head. Wearing an HMD, the user can look around and see the simulated virtual world just as in the real world.

Furthermore, objects producing sounds, or the sounds of objects interacting with other objects, are part of our daily experience. Humans, like many animals, have the ability to detect the location of a sound source (i.e. where the sound is coming from). We are very sensitive to the synchronicity between sounds and object-related events, and to the coherence between the observed motion of an object and the displacement of the corresponding sound source; if synchronicity or coherence fails, it is immediately detected, affecting perception of and immersion in the environment. To this end, we are using the Oculus Rift HMD with headphones, combined with our graphical engine, OpenAIR.
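The audio-visual coherence requirement above can be illustrated with the simplest possible spatialisation: deriving stereo gains from the azimuth of the object that emits the sound, so the sound "follows" the object as it moves. The constant-power panning law below is a standard textbook technique, shown purely as a sketch; it is not the OpenAIR implementation.

```python
# Minimal sketch of azimuth-driven stereo panning (constant-power law).
# A full engine would use HRTFs and distance attenuation; this only shows
# how a visual position can drive the corresponding sound placement.
import math

def pan_gains(azimuth_deg):
    """Left/right gains for an azimuth in [-90, 90] degrees
    (negative = left of the listener, positive = right)."""
    theta = (azimuth_deg + 90) / 180 * (math.pi / 2)  # map to [0, pi/2]
    return math.cos(theta), math.sin(theta)           # (left, right)

left, right = pan_gains(0)       # source straight ahead
assert abs(left - right) < 1e-9  # equal energy in both ears
```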

4.4 Tracking System

With a level of importance similar to that of the sensory stimuli systems used, the tracking system is the cornerstone that transforms a simple wearable visualisation system into an immersive system in which the user can be active. An unobtrusive tracking mechanism is required to register any head and body motion and provide the data the computer needs to make the required changes in viewpoint and position.

To obtain a full body tracker, we use an RGB-D sensor (Kinect) to collect all joint orientations and map them onto the 3D model's skeleton. This allows the user to interact with the virtual world and have the virtual representation mimic his/her movements synchronously, thus improving immersion.
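The joint-to-skeleton mapping can be sketched as a simple retargeting table. The joint names and the dictionary-based rig below are illustrative assumptions for this sketch, not the Kinect SDK's API or the avatar format actually used.

```python
# Hypothetical sketch of retargeting tracked joint orientations onto an
# avatar skeleton: each sensor joint maps to a named avatar bone, and its
# orientation (here a quaternion tuple) is copied onto that bone.

TRACKED_TO_AVATAR = {      # illustrative subset of a full-body mapping
    "SpineBase": "hips",
    "ShoulderLeft": "left_shoulder",
    "ElbowLeft": "left_elbow",
}

def retarget(tracked_orientations):
    """Build an avatar pose from one frame of tracked joint orientations,
    silently skipping joints that have no mapped bone."""
    pose = {}
    for sensor_joint, quat in tracked_orientations.items():
        bone = TRACKED_TO_AVATAR.get(sensor_joint)
        if bone is not None:
            pose[bone] = quat
    return pose

frame = {"ElbowLeft": (0.0, 0.0, 0.0, 1.0), "Unmapped": (0.0, 0.0, 0.0, 1.0)}
assert retarget(frame) == {"left_elbow": (0.0, 0.0, 0.0, 1.0)}
```

In practice each orientation would also be converted between the sensor's and the rig's coordinate frames; the table-driven structure stays the same.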

Additionally, there are reflexive behaviours that occur when experiencing virtual environments and are very important to collect, such as gaze orientation, eye blinking or smiling at a virtual character. To obtain this information, we are currently designing a prototype with cameras mounted on the HMD, covering and tracking the lower face and eye movements.

5 Discussion of Results and Critical View

We developed a small virtual room with a mirror where the participant can freely move and look around. In the mirror, his/her reflection mimics all movements, providing a good way to adapt to and feel immersed in the system. After a minute of adaptation, the user is asked to touch his/her own reflection; when he/she does so, a guillotine falls, cutting off the hand and leaving it bleeding. At this point, different reactions occur in the participants' motor and physiological systems.

In our tests with this virtual mirror prototype, most participants showed high peaks in Electro-Dermal Activity (EDA), confirming that physiological data are directly related to emotional responses and reactions to stimuli (Fig. 3).
Fig. 3.

Immersive Virtual Mirror Experiment (dashed line represents the moment when guillotine falls).

Fig. 4.

Stressful environment: Haunted house (left). Relaxing environment: Zen garden (right).

After these preliminary results, we decided to add more scenarios and to extend our data acquisition to other bio-signals, such as Heart Rate (HR), Respiratory Rate (RR), Body Temperature (BT), and Body Movements (BM). These additional bio-signals will enable a more complete analysis of human physiological responses to emotional states or activation, in an attempt to establish a correlation between them. Figure 4 shows these newly acquired signals in two different situations: the left image corresponds to a set of stressful situations in a haunted house, and the right one to a peaceful environment.
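The EDA peaks reported above suggest a simple event-locked analysis: check whether skin conductance rises above its pre-event baseline within a short window after the stimulus (such as the falling guillotine). The sketch below is an assumption-laden illustration of that idea; the window length and threshold are made-up values, not the study's parameters.

```python
# Illustrative event-locked EDA analysis: did the signal rise above
# baseline + threshold within `window` samples after the stimulus event?
# Window and threshold values are hypothetical, chosen for the example.

def eda_response(samples, event_index, window=5, threshold=0.2):
    """True if EDA exceeds the at-event baseline by `threshold`
    within `window` samples after the event."""
    baseline = samples[event_index]
    post = samples[event_index + 1:event_index + 1 + window]
    return any(s > baseline + threshold for s in post)

eda = [1.0, 1.0, 1.1, 1.0, 1.6, 1.8, 1.4, 1.2]  # toy trace (microsiemens)
assert eda_response(eda, event_index=3)           # clear post-event peak
assert not eda_response(eda, event_index=3, threshold=1.0)
```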

6 Conclusions

Being able to inhabit a virtual world from the same perspective as in real life unlocks the possibility of creating experiences plausible enough to be used as a tool in clinical therapy. Our tests have been confirming this claim by showing that different experiences can indeed elicit different physiological responses in the participant. These responses can be used by the therapist as an additional source of information about the patient's evolution during therapy sessions. This leads us to believe that our system can indeed aid in the treatment of several psychopathologies.

References

  1. Emmelkamp, P., Bouman, T., Scholing, A.: Anxiety Disorders: A Practitioner’s Guide, 1st edn. Wiley, Manhattan (1992)
  2. Blascovich, J., Loomis, J., Beall, A.C., Swinth, K.R., Hoyt, C.L., Bailenson, J.N.: Immersive virtual environment technology as a methodological tool for social psychology. Psychol. Inq. 13(2), 103–124 (2002)
  3. Powers, M.B., Emmelkamp, P.: Virtual reality exposure therapy for anxiety disorders: a meta-analysis. J. Anxiety Disord. 22, 561–569 (2008)
  4. Ochsner, K.N., Knierim, K., Ludlow, D.H., Hanelin, J., Ramachandran, T., Glover, G., Mackey, S.C.: Reflecting upon feelings: an fMRI study of neural systems supporting the attribution of emotion to self and other. J. Cogn. Neurosci. 16(10), 1746–1772 (2004)
  5. Ruby, P., Decety, J.: How would you feel versus how do you think she would feel? A neuroimaging study of perspective-taking with social emotions. J. Cogn. Neurosci. 16(6), 988–999 (2004)
  6. Rizzo, A., Pair, J., Graap, K., Manson, B., McNerney, P.J., Wiederhold, B., Wiederhold, M., Spira, J.: A virtual reality exposure therapy application for Iraq war military personnel with post-traumatic stress disorder: from training to toy to treatment. In: Roy, M. (ed.) NATO Advanced Research Workshop on Novel Approaches to the Diagnosis and Treatment of Posttraumatic Stress Disorder, pp. 235–250. IOS Press, Washington, D.C. (2004)
  7. Sanchez-Vives, M.V., Slater, M.: From presence to consciousness through virtual reality. Nat. Rev. Neurosci. 6(4), 332–339 (2005)
  8. Pan, X., Gillies, M., Barker, C., Clark, D.M., Slater, M.: Socially anxious and confident men interact with a forward virtual woman: an experimental study. PLoS ONE 7(4), e32931 (2012)
  9. Riva, G., Mantovani, F., Capideville, C.S., Preziosa, A., Morganti, F., et al.: Affective interactions using virtual reality: the link between presence and emotions. CyberPsychol. Behav. 10(1), 45–56 (2007)
  10. Botvinick, M., Cohen, J.: Rubber hands ‘feel’ touch that eyes see. Nature 391, 756 (1998)

Copyright information

© IFIP International Federation for Information Processing 2016

Authors and Affiliations

  1. Faculty of Sciences and Technology, Institute of Systems and Robotics (ISR-UC), University of Coimbra, Coimbra, Portugal
  2. Faculty of Psychology and Education Sciences, Cognitive and Behavioural Centre for Research and Intervention (CINEICC), University of Coimbra, Coimbra, Portugal
