
Calibrating, Rendering and Evaluating the Head Mounted Light Field Display

  • Anne Juhler Hansen
  • Jákup Klein
  • Martin Kraus
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 983)

Abstract

A light field display offers several benefits over a traditional HMD; in particular, a light field can avoid the vergence-accommodation conflict and can correct for near- and farsightedness. By rendering only the four corner cameras of a subimage array, the remaining subimages of the light field can be interpolated from these four views. We implement the interpolation of the subimages with pixel reprojection, while maintaining correct perspective and shading. We give a comprehensive explanation of the construction and calibration of a head mounted light field display. Finally, we evaluate the image quality through image difference, and we conduct a user evaluation to determine whether users can perceive a difference between light field images created with the full array of virtual cameras and those created with our method using four cameras and pixel reprojection. In most cases the users were unable to distinguish the images, and we conclude that pixel reprojection is a feasible rendering method for light fields as far as image quality is concerned.
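The core of the interpolation step described above is depth-based pixel reprojection: a pixel rendered by one of the corner cameras is back-projected into 3-D using its depth value and then projected into the image plane of an intermediate subimage camera. The sketch below illustrates this mapping for a pinhole camera model; it is not the authors' implementation, and the function name, the shared-intrinsics assumption, and the camera-to-world pose convention are illustrative assumptions.

```python
import numpy as np

def reproject(depth, K, src_pose, dst_pose):
    """Map source-view pixel coordinates into a target view via depth.

    depth:     (H, W) depth map of the source camera.
    K:         (3, 3) intrinsic matrix, assumed shared by both cameras.
    src_pose,
    dst_pose:  (4, 4) camera-to-world pose matrices.
    Returns an (H, W, 2) array of target-view pixel coordinates.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    # Homogeneous pixel coordinates (u, v, 1) for every source pixel.
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(float)
    # Back-project to 3-D points in the source camera frame.
    cam_pts = (np.linalg.inv(K) @ pix[..., None])[..., 0] * depth[..., None]
    cam_h = np.concatenate([cam_pts, np.ones((H, W, 1))], axis=-1)
    # Source camera frame -> world frame -> target camera frame.
    world = (src_pose @ cam_h[..., None])[..., 0]
    tgt = (np.linalg.inv(dst_pose) @ world[..., None])[..., 0][..., :3]
    # Project with the intrinsics and dehomogenise.
    proj = (K @ tgt[..., None])[..., 0]
    return proj[..., :2] / proj[..., 2:3]
```

For two cameras separated by a horizontal baseline b, this reduces to the familiar disparity shift f·b/z, which is why the four corner views suffice to synthesise the in-between subimages wherever the scene is visible from a corner camera.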


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Anne Juhler Hansen
  • Jákup Klein
  • Martin Kraus

  1. Aalborg University, Aalborg, Denmark
