
Polarization-Based Illumination Detection for Coherent Augmented Reality Scene Rendering in Dynamic Environments

  • A’aeshah Alhakamy
  • Mihran Tuceryan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11542)

Abstract

Integrating a virtual object into the real world in a perceptually coherent manner, using the physical illumination information of the current environment, remains an open problem. Several researchers have investigated this problem and produced high-quality results; however, their systems relied on pre-computation and the offline availability of resources. In this paper, we propose a novel and robust approach that identifies the incident light in the scene using the polarization properties of the light wave and uses this information to produce a visually coherent augmented reality within a dynamic environment. This approach is part of a complete system with three components that run simultaneously in real time: (i) detection of the incident light angle, (ii) estimation of the reflected light, and (iii) creation of the shading properties required to render any virtual object with the detected lighting, reflected shadows, and adequate materials. Finally, the system performance is analyzed, showing that our approach reduces the overall computational cost.
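
To make the polarization-based detection step concrete, the sketch below shows the standard way to recover the angle and degree of linear polarization from intensity images captured behind a linear polarizer at 0°, 45°, 90°, and 135°, using the linear Stokes parameters. This is a minimal illustration of the underlying optics under those assumptions, not the authors' implementation; the function name and the synthetic input are hypothetical.

```python
import numpy as np

def polarization_parameters(i0, i45, i90, i135):
    """Estimate linear-polarization parameters from four intensity images
    taken behind a linear polarizer oriented at 0, 45, 90, and 135 degrees.

    Returns (aop, dolp): the angle of linear polarization in radians
    and the degree of linear polarization in [0, 1], per pixel.
    """
    # Linear Stokes parameters (the circular component S3 is not observable
    # with a linear polarizer alone and is ignored here).
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical preference
    s2 = i45 - i135                      # +45 vs. -45 degree preference

    aop = 0.5 * np.arctan2(s2, s1)                          # angle of polarization
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-8)    # degree of linear polarization
    return aop, dolp

if __name__ == "__main__":
    # Synthetic example: random per-pixel intensities standing in for
    # polarizer-filtered camera captures.
    rng = np.random.default_rng(0)
    i0, i45, i90, i135 = rng.uniform(0.2, 1.0, size=(4, 8, 8))
    aop, dolp = polarization_parameters(i0, i45, i90, i135)
    print(aop.shape, float(dolp.mean()))
```

In a system like the one described, per-pixel angle and degree of polarization maps of this kind would feed the incident-light-angle detection stage, with the remaining components estimating reflected light and shading from that result.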

Keywords

Augmented and mixed environments · Interaction design · Scene perception · Texture perception


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Indiana University - Purdue University Indianapolis (IUPUI), Indianapolis, USA
  2. University of Tabuk, Tabuk, Saudi Arabia
