Collaborative immersive authoring tool for real-time creation of multisensory VR experiences

A Correction to this article was published on 21 January 2020


Abstract

With the emergence of innovative virtual reality (VR) technologies, the need to create immersive content has arisen. Although non-immersive solutions already exist for authoring immersive audio-visual content, there are no solutions that allow the creation of immersive multisensory content. This work proposes a novel architecture for a collaborative immersive tool that allows multisensory VR experiences to be created in real time, thus promoting the expeditious development, adoption, and use of immersive systems and enabling the building of custom solutions that can be used intuitively to support organizations’ business initiatives. To validate the proposal, two approaches to the authoring tool (a Desktop interface and an Immersive interface) were evaluated in a usability study, which demonstrated not only the participants’ acceptance of the authoring tool but also the importance of immersive interfaces for creating such VR experiences.
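To make the collaborative, real-time aspect more concrete, the following is a minimal sketch, assuming a simple event-broadcast design in which every edit made in either the Desktop or the Immersive interface is serialized and pushed to all connected clients. The class, field, and method names (EditEvent, broadcast, peer.send) are illustrative assumptions and do not come from the paper.

```python
import json
from dataclasses import asdict, dataclass

# Illustrative sketch only: one plausible shape for an edit event that a
# collaborative authoring session could broadcast so that the Desktop and
# Immersive interfaces stay synchronized. All names are assumptions.

@dataclass
class EditEvent:
    client: str    # which interface produced the edit, e.g. "desktop" or "immersive"
    action: str    # e.g. "add_stimulus", "edit_stimulus", "remove_stimulus"
    payload: dict  # stimulus parameters (name, type, start/end time, position, ...)

def broadcast(event: EditEvent, clients) -> None:
    """Serialize the edit and send it to every connected client,
    which then applies it to its local copy of the scene."""
    message = json.dumps(asdict(event))
    for peer in clients:
        peer.send(message)  # assumed transport object with a send() method
```

Under a design like this, both interfaces would apply the same stream of edits and so converge on the same project state.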


Change history

  • 21 January 2020

    In the original publication, Figs. 1 and 2 were interchanged, and the citation of Fig. 1 in the third paragraph of Section 2.2 (Authoring tools for multisensory VR experiences) should be removed.


Author information

Corresponding author

Correspondence to Hugo Coelho.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Experimental protocol

  • Open a new project and select the videos named “Capela Nova”.

  • Add a new sound stimulus and select the audio source named “Som Ambiente”. Name the new stimulus “Som Ambiente”; its start and finish times are to be 00:00 and 01:00, respectively. Then, place it at the location where you want its origin to be.

  • Add a new wind stimulus and name it “Vento Ambiente”. Its start and finish times are to be 00:05 and 00:30, respectively; choose the intensity that you feel is most comfortable for this environment by testing it. Then, place it at the location where you want its origin to be.

  • Add a new smell stimulus and name it “Cheiro Rosas”. Its start and finish times are to be 00:15 and 00:30, respectively; choose “Capsule 2”, which holds the desired scent, and feel free to test it. Then, place it at the location where you want its origin to be.

  • Add a new transducer stimulus and name it “Vibração”. Its start and finish times are to be 00:30 and 00:45, respectively; choose the intensity that you feel is most comfortable for this environment by testing it. Then, place it at the location where you want its origin to be.

  • Edit the smell stimulus named “Cheiro Rosas”, and change its start time to 00:25.

  • Edit the transducer stimulus named “Vibração”, and change its name to “Vibração Carro” and its end time to 00:55.

  • Save the Project, and name it “Capela Nova”.

  • Open the Project named “England” and play it. (An illustrative sketch of how the stimuli configured in this protocol might be represented as data is given after this list.)
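For illustration only, here is a minimal sketch of how the timed stimuli configured above might be represented as data. The Stimulus and Project classes, their field names, and the intensity values are assumptions made for this example, not the authors’ actual data model.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical data model for a project's multisensory timeline.
# Structure and field names are illustrative assumptions.

@dataclass
class Stimulus:
    kind: str        # "sound", "wind", "smell" or "transducer"
    name: str        # label shown in the authoring tool
    start: str       # start time, "MM:SS"
    end: str         # finish time, "MM:SS"
    intensity: Optional[float] = None                      # wind/transducer strength
    source: Optional[str] = None                           # audio source or smell capsule
    position: Optional[Tuple[float, float, float]] = None  # chosen origin in the scene

@dataclass
class Project:
    name: str
    video: str
    stimuli: List[Stimulus] = field(default_factory=list)

    def add(self, stimulus: Stimulus) -> None:
        self.stimuli.append(stimulus)

# The "Capela Nova" project after all protocol steps (final values,
# including the edited smell start time and the renamed transducer stimulus).
project = Project(name="Capela Nova", video="Capela Nova")
project.add(Stimulus("sound", "Som Ambiente", "00:00", "01:00", source="Som Ambiente"))
project.add(Stimulus("wind", "Vento Ambiente", "00:05", "00:30", intensity=0.5))
project.add(Stimulus("smell", "Cheiro Rosas", "00:25", "00:30", source="Capsule 2"))
project.add(Stimulus("transducer", "Vibração Carro", "00:30", "00:55", intensity=0.5))
```

Serializing such a structure (for example, to JSON) is one plausible way a project could be saved under a name such as “Capela Nova” and later reopened and played, as in the final steps of the protocol.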


About this article


Cite this article

Coelho, H., Melo, M., Martins, J. et al. Collaborative immersive authoring tool for real-time creation of multisensory VR experiences. Multimed Tools Appl 78, 19473–19493 (2019). https://doi.org/10.1007/s11042-019-7309-x


Keywords

  • Collaborative
  • Multisensory
  • Virtual reality
  • Real-time
  • Authoring tool