Abstract
Consistent illumination of virtual and real objects in augmented reality (AR) is essential for visual coherence. This paper presents a practical two-step method for rendering with consistent illumination in AR. In the first step, the user scans the surrounding environment with a rotational motion of the mobile device, capturing the real illumination. The real light is captured in high dynamic range (HDR) to preserve its high contrast. In the second step, the captured environment map is used to precompute a set of reflection maps on the mobile GPU, which are then used for real-time rendering with consistent illumination. Our method produces high-quality reflection maps because the convolution of the environment map with the BRDF is computed accurately for each pixel of the output map. Moreover, we utilize multiple render targets to calculate reflection maps for multiple materials simultaneously. The presented method increases visual coherence between virtual and real objects and is highly practical for mobile AR, as it requires only a commodity mobile device.
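The per-pixel radiance convolution described in the abstract can be illustrated with a minimal CPU sketch. This is not the authors' GPU implementation: the lat-long (equirectangular) parameterization and the normalized Phong lobe are assumptions for illustration, and the paper's version runs on the mobile GPU, filling reflection maps for several materials at once via multiple render targets.

```python
import numpy as np

def direction_grid(h, w):
    """Unit directions for the texels of an h x w lat-long environment map."""
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # azimuth
    t, p = np.meshgrid(theta, phi, indexing="ij")
    return np.stack([np.sin(t) * np.cos(p),
                     np.sin(t) * np.sin(p),
                     np.cos(t)], axis=-1)             # shape (h, w, 3)

def prefilter_phong(env, shininess):
    """Convolve an HDR lat-long environment map with a normalized Phong lobe.

    For each output texel with reflection direction d:
        R(d) = sum_i L(w_i) * max(d . w_i, 0)^n * sin(theta_i)  /  (total weight)
    i.e. an exact per-pixel convolution over all incoming directions,
    rather than a mip-map approximation.
    """
    h, w, _ = env.shape
    dirs = direction_grid(h, w).reshape(-1, 3)        # incoming light directions
    # sin(theta) accounts for the solid angle of each lat-long texel
    sin_t = np.repeat(np.sin((np.arange(h) + 0.5) / h * np.pi), w)
    L = env.reshape(-1, 3)
    out = np.empty((h * w, 3))
    for i, d in enumerate(dirs):                      # one output texel per direction
        lobe = np.clip(dirs @ d, 0.0, None) ** shininess
        wgt = lobe * sin_t
        out[i] = (L * wgt[:, None]).sum(axis=0) / wgt.sum()
    return out.reshape(h, w, 3)

# A constant environment must prefilter to the same constant,
# regardless of the Phong exponent.
env = np.ones((8, 16, 3))
refl = prefilter_phong(env, 32.0)
```

On the GPU, the loop over output texels maps to fragment shader invocations, and evaluating several Phong exponents (materials) in one pass corresponds to writing one reflection map per render target.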
Acknowledgements
The dragon model is courtesy of the Stanford Computer Graphics Laboratory. The teapot model is courtesy of Martin Newell. This research was funded by Austrian project FFG-BRIDGE 843484.
© 2015 Springer International Publishing Switzerland
Cite this paper
Kán, P., Unterguggenberger, J., Kaufmann, H. (2015). High-Quality Consistent Illumination in Mobile Augmented Reality by Radiance Convolution on the GPU. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2015. Lecture Notes in Computer Science(), vol 9474. Springer, Cham. https://doi.org/10.1007/978-3-319-27857-5_52
Print ISBN: 978-3-319-27856-8
Online ISBN: 978-3-319-27857-5