Depth Sensor Based Detection of Obstacles and Notification for Virtual Reality Systems

  • Peter Wozniak
  • Antonio Capobianco
  • Nicolas Javahiraly
  • Dan Curticapean
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 973)

Abstract

Walking interfaces offer advantages for navigating virtual environments (VEs) over other types of locomotion. However, VR head-mounted displays have the disadvantage that users cannot see their immediate physical surroundings. This paper describes the prototypical implementation of a VE system capable of detecting possible obstacles using an RGB-D sensor. To warn users of potential collisions with real objects while they move through the tracking area, we designed four different visual warning metaphors: Placeholder, Rubber Band, Color Indicator and Arrow. A small pilot study was carried out in which participants had to solve a simple task while avoiding arbitrarily placed physical obstacles when crossing the virtual scene. Our results show that the Placeholder metaphor (in this case: trees) appears best suited among the variants for correctly estimating the positions of obstacles and for evading them.
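
The detection step described above can be pictured with a short sketch. The following is a minimal, illustrative example only, not the authors' implementation: it back-projects a single depth frame into 3-D points and flags those that lie inside the tracking area and above the floor as candidate obstacles. All names and numeric values (camera intrinsics, floor margin, play-area extents) are assumptions introduced purely for illustration.

```python
# Minimal sketch (not the authors' implementation) of depth-sensor-based
# obstacle detection inside a VR tracking area. Assumes the depth frame is
# already registered to a tracking-space frame with +y pointing up and the
# origin on the floor; intrinsics and thresholds below are placeholder values.

import numpy as np

FX, FY, CX, CY = 365.0, 365.0, 256.0, 212.0   # assumed depth-camera intrinsics
FLOOR_MARGIN_M = 0.05                          # points below this height count as floor
TRACKING_AREA_M = (2.0, 2.0)                   # assumed half-extents (x, z) of the play area


def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project a depth image (metres) into an N x 3 array of 3-D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, -y, z], axis=-1).reshape(-1, 3)   # flip y so +y points up
    return pts[depth_m.reshape(-1) > 0]                    # drop invalid (zero-depth) pixels


def obstacle_points(points: np.ndarray) -> np.ndarray:
    """Keep points inside the tracking area that rise above the floor margin."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    inside = (np.abs(x) < TRACKING_AREA_M[0]) & (np.abs(z) < TRACKING_AREA_M[1])
    above_floor = y > FLOOR_MARGIN_M
    return points[inside & above_floor]


if __name__ == "__main__":
    # Synthetic frame: empty room at 3 m with a box-shaped "obstacle" at 1.2 m.
    depth = np.full((424, 512), 3.0, dtype=np.float32)
    depth[100:200, 200:260] = 1.2
    obstacles = obstacle_points(depth_to_points(depth))
    print(f"{len(obstacles)} candidate obstacle points detected")
```

In a complete system the flagged points would then drive the visual warning metaphor (e.g. spawning a placeholder object at the obstacle's estimated position); that step is omitted here.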

Keywords

Virtual Reality · Notifications · Interaction metaphor · Collision avoidance · 3D interaction · Navigation · Walking workspace · Range imaging · RGB-D

Notes

Acknowledgments

The authors would like to thank the participants of the study for their time and precious feedback.

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Peter Wozniak (1, 2) (corresponding author)
  • Antonio Capobianco (1)
  • Nicolas Javahiraly (1)
  • Dan Curticapean (2)
  1. I-Cube, University of Strasbourg, Strasbourg, France
  2. Offenburg University, Offenburg, Germany