Differential G-Buffer Rendering for Mediated Reality Applications

  • Tobias Schwandt
  • Wolfgang Broll
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10325)

Abstract

Physically-based approaches are increasingly used across a wide range of computer graphics. Modern graphics engines can thereby produce realistic output using physically correct values instead of analytical approximations. Such applications apply the final lighting to a geometry buffer in order to reduce complexity. To use this approach for Mediated Reality applications, some changes are required to fuse the real and the virtual world. In this paper, we present an approach that focuses on extracting real-world environment information and storing it directly in the geometry buffer. To this end, we introduce a solution that uses spatial geometry to integrate the real world into the virtual environment. The approach runs in real time and allows for visual interaction between virtual and real-world objects. Moreover, manipulation of the real world is easily possible.
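The core idea of differential rendering, which the title refers to, can be summarized as compositing the real camera image with the *difference* between two synthetic renderings of the reconstructed scene: one with and one without the virtual objects. A minimal sketch, with all function and variable names being illustrative rather than taken from the authors' implementation:

```python
def differential_composite(camera_pixel, lit_with_virtual, lit_without_virtual):
    """Combine a real camera pixel with two synthetic renderings.

    camera_pixel        -- RGB value from the real photograph
    lit_with_virtual    -- rendering of the reconstructed scene plus virtual objects
    lit_without_virtual -- rendering of the reconstructed scene alone

    The difference of the two renderings isolates the light-transport
    changes (shadows, reflections) caused by the virtual objects; these
    changes are then applied on top of the real image.
    """
    return tuple(
        max(0.0, min(1.0, c + (w - wo)))
        for c, w, wo in zip(camera_pixel, lit_with_virtual, lit_without_virtual)
    )

# Example: a virtual object casts a shadow that darkens a real pixel.
pixel = differential_composite(
    (0.8, 0.8, 0.8),   # real photo
    (0.5, 0.5, 0.5),   # render with the virtual object
    (0.7, 0.7, 0.7),   # render without it
)
print(pixel)  # the -0.2 lighting difference is applied to the photo
```

The contribution of the paper is to perform this step on the geometry buffer itself, i.e. to write the reconstructed real-world surfaces into the same G-buffer as the virtual geometry, so that a single physically-based lighting pass covers both worlds.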

Keywords

Mediated reality · Mixed reality · Augmented reality · Differential rendering · Physically based rendering

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Ilmenau University of Technology, Ilmenau, Germany