
Multimedia Tools and Applications, Volume 75, Issue 16, pp 9549–9562

Enabling consistent hand-based interaction in mixed reality by occlusions handling

  • F. Narducci
  • S. Ricciardi
  • R. Vertucci

Abstract

A mixed reality environment, namely the space resulting from displaying virtual contents co-registered to the real space, represents an effective paradigm for bringing the potential of virtual reality into everyday life instead of confining it within a computer screen. In this context, gesture-based interaction appears to be the most suitable approach for human-machine interfacing. However, for the interaction to be visually consistent, the three-dimensional composition of virtual objects onto the real background should be performed respecting the distance of each rendered pixel from the user's viewpoint. This paper describes a simple yet effective hand/finger-based interaction system and a virtual-to-real occlusion-handling approach, able to process the stereoscopic video see-through stream in real time and obtain pixel-wise z-order information, which is crucial for deciding whether each rendered pixel should be displayed. The experiments confirm the efficacy of the proposed method in a simulation context.
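The per-pixel z-ordering described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes OpenCV's StereoSGBM as the disparity estimator, a calibrated pinhole stereo rig (focal length in pixels and baseline in metres), and a renderer that exposes the virtual layer's colour and depth buffers. All function and parameter names are illustrative.

```python
import cv2
import numpy as np

# Illustrative per-pixel virtual-to-real occlusion handling: estimate real-world
# depth from the stereo see-through pair, then keep a virtual pixel only when
# the real scene (e.g. the user's hand) is farther than the virtual surface.

# Semi-global block matching is one possible disparity estimator; the paper's
# actual stereo-matching choice may differ.
stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)

def compose_frame(left_gray, right_gray, left_color,
                  virtual_rgb, virtual_depth,
                  focal_px, baseline_m):
    """Blend the rendered virtual layer onto the left camera image,
    discarding virtual pixels occluded by nearer real surfaces."""
    # OpenCV returns fixed-point disparities scaled by 16.
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    valid = disparity > 0

    # Triangulate: depth = f * B / d (pinhole stereo assumption);
    # pixels with no valid disparity are treated as infinitely far.
    real_depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    real_depth[valid] = focal_px * baseline_m / disparity[valid]

    # Assume the renderer marks empty pixels with an infinite depth value.
    # A virtual pixel is visible where it lies closer to the user than the
    # real surface seen at the same pixel.
    drawn = virtual_depth < np.inf
    visible = drawn & (virtual_depth < real_depth)

    output = left_color.copy()
    output[visible] = virtual_rgb[visible]
    return output
```

In such a scheme the comparison runs once per pixel per frame, so the real-time constraint rests mainly on the cost of the disparity computation.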

Keywords

Mixed reality · Hand occlusion · Disparity map


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. VRLab, University of Salerno, Fisciano, Italy
  2. Selex Electronic Systems, Giugliano, Italy
