
Instantaneous foveated preview for progressive Monte Carlo rendering

Abstract

Progressive rendering, for example Monte Carlo rendering of 360° content for virtual reality headsets, is a time-consuming task. If the 3D artist notices an error while previewing the rendering, they must return to editing mode, make the required changes, and restart rendering. We propose the use of eye-tracking-based optimization to significantly speed up previewing of the artist’s points of interest. The speed of the preview is further improved by sampling with a distribution that closely follows the experimentally measured visual acuity of the human eye, unlike the piecewise linear models used in previous work. In a comprehensive user study, the perceived convergence of our proposed method was 10 times faster than that of a conventional preview, and often appeared to be instantaneous. In addition, the participants rated the method as having only marginally more artifacts in areas where it had to start rendering from scratch, compared to conventional rendering methods that had already generated image content in those areas.
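The acuity-driven sample distribution mentioned above can be illustrated with a small sketch. The Python snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a simple exponential falloff of sample density, exp(-a·r), with eccentricity r from the gaze point and an illustrative decay constant a, and draws gaze-centred sample positions by inverting the resulting radial CDF in closed form with the Lambert W function (scipy.special.lambertw). The function name sample_foveated and all parameter values are hypothetical.

    # Sketch: inverse-transform sampling of sample positions whose areal
    # density falls off exponentially with eccentricity from the gaze point.
    # The exponential form and the constant `a` are assumptions for
    # illustration, not the paper's exact acuity model.
    import numpy as np
    from scipy.special import lambertw

    def sample_foveated(n, gaze, max_ecc_deg=110.0, a=0.1, rng=None):
        """Draw n sample positions (in degrees) around the gaze point.

        Areal density is proportional to exp(-a * r), so the radial pdf is
        p(r) ∝ r * exp(-a * r) on [0, max_ecc_deg]. Its CDF is inverted in
        closed form using the Lambert W function (branch -1).
        """
        rng = np.random.default_rng() if rng is None else rng
        u = rng.random(n)
        R = max_ecc_deg
        # Normalisation constant of the truncated radial CDF.
        C = 1.0 - np.exp(-a * R) * (1.0 + a * R)
        # Invert F(r) = (1 - exp(-a r)(1 + a r)) / C  via W_{-1}.
        arg = -(1.0 - u * C) / np.e
        r = (-lambertw(arg, k=-1).real - 1.0) / a       # eccentricity in degrees
        theta = rng.uniform(0.0, 2.0 * np.pi, n)        # uniform polar angle
        x = gaze[0] + r * np.cos(theta)
        y = gaze[1] + r * np.sin(theta)
        return np.stack([x, y], axis=1)

    # Example: 10,000 samples concentrated around a gaze point at (0, 0).
    samples = sample_foveated(10_000, gaze=(0.0, 0.0))

Because the inverse CDF is available in closed form, each sample costs only one uniform random number and one Lambert W evaluation, which keeps the sample placement cheap enough for an interactive preview loop.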


Acknowledgements

The authors would like to thank the creators of the 3D models used in the user study: Christophe Seux for Classroom, Anat Grynberg and Greg Ward for Conference, Marko Dabrovic for Sibenik (License: CC BY-NC), and Frank Meinl for Crytek Sponza (License: CC BY 3.0). In addition, the authors would like to thank Heli Väätäjä, Chelsea Kelling, and Otto Kauhanen for helpful discussions.

Author information

Correspondence to Matias K. Koskela.

Additional information

Matias K. Koskela received his bachelor's and master's degrees with honors from Tampere University of Technology in 2014 and 2015, respectively. His research interests include optimizations and parallelism in real-time rendering, especially in real-time ray tracing.

Kalle V. Immonen received his M.Sc. (Tech.) degree in pervasive systems from Tampere University of Technology (TUT) in 2017. He is now working as a project researcher at TUT. His research interests include computer graphics methods and algorithms.

Timo T. Viitanen received his M.Sc. degree in embedded systems from Tampere University of Technology (TUT) in 2013, and is now a graduate student in the Laboratory of Pervasive Computing, TUT. He is the recipient of a TUT graduate school position and was awarded a Nokia Scholarship in 2014. His research interests include computer architecture and computer graphics.

Pekka O. Jääskeläinen (Adjunct Professor) has worked on heterogeneous platform customization and programming since the early 2000s. He has published over 60 academic papers in journals and conferences, and is an active contributor to various heterogeneous parallel platform related open source projects. In addition to his ongoing research on methods and tools to reduce the engineering effort involved in design and programming of heterogeneous platforms, he is interested in next generation programmable graphics architectures for photorealistic real-time rendering in the context of small form factor VR/AR products of the future.

Joonas I. Multanen received his M.Sc. degree in electrical engineering from Tampere University of Technology (TUT) in 2015. He is currently a graduate student at the Laboratory of Pervasive Computing, TUT. His research interests include energy efficient computer architectures and computer graphics.

Jarmo H. Takala received his M.Sc. (hons) degree in electrical engineering and Dr.Tech. degree in information technology from Tampere University of Technology (TUT), Tampere, Finland, in 1987 and 1999, respectively. From 1992 to 1995, he was a research scientist at VTT-Automation, Tampere, Finland. Between 1995 and 1996, he was a senior research engineer at Nokia Research Center, Tampere, Finland. From 1996 to 1999, he was a researcher at TUT. Since 2000, he has been a professor in computer engineering at TUT and is currently vice president of TUT. Dr. Takala is Co-Editor-in-Chief for Journal of Signal Processing Systems. During 2007–2011 he was Associate Editor and Area Editor for IEEE Transactions on Signal Processing. His research interests include circuit techniques, parallel architectures, and design methodologies.

Electronic supplementary material

Supplementary material, approximately 101 MB.


About this article

Cite this article

Koskela, M.K., Immonen, K.V., Viitanen, T.T. et al. Instantaneous foveated preview for progressive Monte Carlo rendering. Comp. Visual Media 4, 267–276 (2018). https://doi.org/10.1007/s41095-018-0113-0

Keywords

  • foveated rendering
  • progressive rendering
  • Monte Carlo rendering
  • preview
  • 360° content