
Exploring legibility of augmented reality X-ray

Published in Multimedia Tools and Applications

Abstract

Augmented reality (AR) can visualize virtual objects inside real objects. This visualization is called AR X-ray because it creates the impression of seeing through the real object. Whereas standard AR overlays virtual information on top of the real world, AR X-ray must partially occlude the virtual object with visually important regions of the real object to make it appear inside. As a result, the virtual object becomes less legible than when it is completely unoccluded. Legibility is an important consideration for many applications of AR X-ray. In this research, we explored legibility in two implementations of AR X-ray, namely, edge-based and saliency-based. In our first experiment, we explored the amount of occlusion users can tolerate while still comfortably distinguishing small virtual objects. In our second experiment, we compared the edge-based and saliency-based methods when visualizing virtual objects inside various real objects, and we benchmarked the legibility of both methods against alpha blending. From our experiments, we observed that users have varied preferences for the proper amount of occlusion cues in both methods. The partial occlusions generated by the edge-based and saliency-based methods need to be adjusted depending on the lighting conditions and the texture complexity of the occluding object. In most cases, users identified objects faster with saliency-based AR X-ray than with edge-based AR X-ray. Insights from this research can be directly applied to the development of AR X-ray applications.
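To make the comparison concrete, below is a minimal sketch of the three composites studied here, written with OpenCV and NumPy. It is an illustration under stated assumptions, not the authors' implementation: the function names, Canny thresholds, and blending weights are hypothetical, and a simplified spectral-residual saliency map stands in for the NMPT saliency model linked in the notes. Both input frames are assumed to be same-sized BGR images, with the virtual object already rendered at the pose where it should appear inside the occluder.

import cv2
import numpy as np

def alpha_blend(background, virtual, alpha=0.5):
    # Uniform blend of the whole frame: the baseline that the two
    # AR X-ray methods are benchmarked against.
    return cv2.addWeighted(virtual, alpha, background, 1.0 - alpha, 0.0)

def edge_based_xray(background, virtual, alpha=0.7, edge_strength=1.0):
    # Blend the virtual object in, then re-draw the occluder's edges
    # on top so its visually important structure partially occludes it.
    blended = alpha_blend(background, virtual, alpha)
    gray = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # thresholds are illustrative
    w = (edges.astype(np.float32) / 255.0 * edge_strength)[..., None]
    return (w * background + (1.0 - w) * blended).astype(np.uint8)

def saliency_based_xray(background, virtual, gain=1.0):
    # Per-pixel blend weighted by a saliency map of the occluder:
    # salient regions stay opaque, the rest reveals the virtual object.
    # Spectral-residual saliency (Hou & Zhang 2007) is used here as a
    # simple stand-in for the NMPT saliency model.
    gray = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (64, 64)).astype(np.float32)
    f = np.fft.fft2(small)
    log_amp, phase = np.log1p(np.abs(f)), np.angle(f)
    residual = log_amp - cv2.blur(log_amp, (3, 3))
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    sal = cv2.GaussianBlur(sal.astype(np.float32), (9, 9), 2.5)
    sal = cv2.resize(sal, (gray.shape[1], gray.shape[0]))
    sal = cv2.normalize(sal, None, 0.0, 1.0, cv2.NORM_MINMAX)
    w = np.clip(sal * gain, 0.0, 1.0)[..., None]
    return (w * background + (1.0 - w) * virtual).astype(np.uint8)

Here alpha, edge_strength, and gain play the role of the amount of occlusion cues varied in the first experiment: raising them preserves more of the real object's structure but makes the hidden virtual object harder to distinguish.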


Notes

  1. OpenCV library: http://docs.opencv.org/

  2. Nick's Machine Perception Toolbox (NMPT): http://mplab.ucsd.edu/~nick/NMPT/
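The notes above point to the toolkits behind the two methods. A hypothetical invocation of the earlier sketch might look like this (file names are placeholders):

bg = cv2.imread("occluder_frame.png")   # camera view of the real object
vr = cv2.imread("virtual_render.png")   # virtual object, same size, registered pose
cv2.imwrite("xray_edge.png", edge_based_xray(bg, vr, alpha=0.7, edge_strength=0.8))
cv2.imwrite("xray_saliency.png", saliency_based_xray(bg, vr, gain=1.2))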


Acknowledgments

This work was supported by the Grant-in-Aid for JSPS Fellows, Grant Number 15J10186.

Author information


Corresponding author

Correspondence to Marc Ericson C. Santos.


Cite this article

Santos, M.E.C., de Souza Almeida, I., Yamamoto, G. et al. Exploring legibility of augmented reality X-ray. Multimed Tools Appl 75, 9563–9585 (2016). https://doi.org/10.1007/s11042-015-2954-1
