Photon mapping with visible kernel domains

  • Original Article
  • The Visual Computer

Abstract

Despite the strong efforts made over the last three decades, lighting simulation systems remain prone to various types of imprecision. This paper specifically tackles the problem of biases due to the density estimation used in photon mapping approaches. We study the fundamental aspects of density estimation and exhibit the need for handling visibility at the early stage of kernel domain definition. We show that properly managing visibility in the density estimation process makes it possible to reduce or remove all these biases at once. In practice, we have implemented a 3D product kernel based on a polyhedral domain, with both point-to-point and point-to-surface visibility computation. Our experimental results illustrate the enhancements produced at every stage of density estimation, for both direct photon-map visualization and progressive photon mapping.


References

  1. Bekaert, P., Slusallek, P., Cools, R., Havran, V., Seidel, H.P.: A custom designed density estimation method for light transport. Technical Report MPI-I-2003-4-004, Max-Planck-Institut für Informatik, Saarbrücken (2003)

  2. Chen, L.H., Tsai, T.C., Chen, Y.S.: Grouped photon mapping. Vis. Comput. 26, 217–226 (2010)

  3. Dammertz, H., Hanika, J., Keller, A.: Shallow bounding volume hierarchies for fast SIMD ray tracing of incoherent rays. Comput. Graph. Forum 27(4), 1225–1233 (2008)

  4. Dutre, P., Bala, K., Bekaert, P., Shirley, P.: Advanced Global Illumination. AK Peters Ltd, Natick (2006)

  5. Georgiev, I., Křivánek, J., Davidovič, T., Slusallek, P.: Light transport simulation with vertex connection and merging. ACM TOG Proc. SIGGRAPH Asia 31(6), 192 (2012)

  6. Hachisuka, T., Ogaki, S., Jensen, H.W.: Progressive photon mapping. ACM TOG Proc. SIGGRAPH Asia 27(5), 30 (2008)

  7. Hachisuka, T., Pantaleoni, J., Jensen, H.W.: A path space extension for robust light transport simulation. ACM TOG Proc. SIGGRAPH Asia 31(6), 191 (2012)

  8. Haumont, D., Mäkinen, O., Nirenstein, S.: A low dimensional framework for exact polygon-to-polygon occlusion queries. In: Eurographics Symposium on Rendering Techniques, pp. 211–222 (2005)

  9. Havran, V., Bittner, J., Herzog, R., Seidel, H.P.: Ray maps for global illumination. In: Eurographics Symposium on Rendering Techniques, pp. 43–54 (2005)

  10. Herzog, R.: Advanced density estimation techniques for global illumination. Master’s thesis, Universität des Saarlandes (MPI Informatik) (2005)

  11. Izenman, A.: Modern Multivariate Statistical Techniques, Regression, Classification, and Manifold Learning. Springer, New York (2008)

  12. Jensen, H.W.: Global illumination using photon maps. In: Eurographics Workshop on Rendering Techniques, pp. 21–30 (1996)

  13. Jensen, H.W.: Realistic Image Synthesis Using Photon Mapping. A. K. Peters, Ltd, Natick (2001)

  14. Jones, M.C.: Simple boundary correction for kernel density estimation. Stat. Comput. 3(3), 135–146 (1993)

  15. Kajiya, J.T.: The rendering equation. In: ACM SIGGRAPH (1986)

  16. Knaus, C., Zwicker, M.: Progressive photon mapping: a probabilistic approach. ACM TOG 30(3), 25 (2011)

  17. Lastra, M., Ureña, C., Revelles, J., Montes, R.: A particle-path based method for Monte-Carlo density estimation. In: Eurographics Workshop on Rendering Techniques (2002)

  18. Lavignotte, F., Paulin, M.: A new approach of density estimation for global illumination. In: WSCG, Plzen, Czech Republic, pp. 263–273 (2002)

  19. Lavignotte, F., Paulin, M.: Scalable photon splatting for global illumination. In: GRAPHITE ’03. ACM, New York (2003)

  20. Mora, F., Aveneau, L., Apostu, O.L., Ghazanfarpour, D.: Lazy visibility evaluation for exact soft shadows. Comput. Graph. Forum 31(1), 132–145 (2012)

  21. Nirenstein, S., Blake, E., Gain, J.: Exact from-region visibility culling. In: Eurographics Workshop on Rendering Techniques (2002)

  22. Pharr, M., Humphreys, G.: Physically Based Rendering: From Theory to Implementation, 2nd edn. Morgan Kaufmann Publishers Inc., San Francisco (2010)

  23. Qin, H., Sun, X., Hou, Q., Guo, B., Zhou, K.: Unbiased photon gathering for light transport simulation. ACM TOG Proc. SIGGRAPH 34(6), 208 (2015)

  24. Schregle, R.: Bias compensation for photon maps. Comput. Graph. Forum 22, 729–742 (2003)

  25. Silverman, B.W.: Density Estimation for Statistics and Data Analysis. Chapman and Hall, Boca Raton (1998)

  26. Teller, S., Hanrahan, P.: Global visibility algorithms for illumination computations. In: ACM SIGGRAPH (1993)

  27. Tobler, R.F., Maierhofer, S.: Improved illumination estimation for photon maps in architectural scenes. In: WSCG, pp. 257–261 (2006)

  28. Veach, E.: Robust Monte Carlo methods for light transport simulation. Ph.D. thesis, Stanford University (1997)

  29. Wald, I., Woop, S., Benthin, C., Johnson, G.S., Ernst, M.: Embree: a kernel framework for efficient CPU ray tracing. ACM TOG Proc. SIGGRAPH 33(4), 143 (2014)

  30. Wand, M.P., Jones, M.C.: Kernel Smoothing, Monographs on Statistics and Applied Probability, vol. 60. Chapman and Hall, London (1995)

Author information

Corresponding author

Correspondence to Daniel Meneveaux.

Appendices

A Insight into density estimation

This section discusses several aspects of density estimation and the importance of consistency with kernel density estimators [11]. In the univariate case, given n independent, identically distributed observations \(x_1, x_2,\ldots ,x_n\) drawn from a density f, the kernel density estimator \(\langle f_h(x) \rangle \) of f(x) is:

$$\begin{aligned} \langle f_h(x) \rangle =\frac{1}{nh}\sum _{i=1}^n\;K_1\left( \frac{x-x_i}{h}\right) ,\quad x\in \mathbb {R},\;h>0, \end{aligned}$$
(10)

where \(K_1\) is a 1D kernel function (with mean 0 and integral equal to 1) and h is the kernel bandwidth. In the following, we are mainly interested in kernels with a compact domain, i.e., kernels \(K_1\) such that

$$\begin{aligned} K_1(x)=0,\, \forall x\not \in [-1,+1]. \end{aligned}$$
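As a concrete illustration (not part of the original paper), Eq. 10 can be sketched with the Epanechnikov kernel, a common compact-support choice; the kernel, sample count, and bandwidth below are illustrative assumptions:

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel K_1: mean 0, integral 1, zero outside [-1, +1]."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u * u), 0.0)

def kde_1d(x, samples, h):
    """Univariate kernel density estimator of Eq. 10."""
    return epanechnikov((x - samples) / h).sum() / (len(samples) * h)

# Estimate a standard normal density at x = 0 from 10,000 samples.
rng = np.random.default_rng(0)
samples = rng.standard_normal(10_000)
print(kde_1d(0.0, samples, h=0.3))  # close to 1/sqrt(2*pi) ≈ 0.3989
```

Because the kernel has compact support, only samples within a distance h of x contribute to the estimate.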

The multivariate kernel density estimator of f is:

$$\begin{aligned} \langle f_{\mathbf {H}}(\mathbf {x}) \rangle =\frac{1}{n\left| \mathbf {H}\right| }\sum _{i=1}^n\;K_d\left( \mathbf {H}^{-1}\left( \mathbf {x}-\mathbf {x}_i\right) \right) ,\quad \mathbf {x}\in \mathbb {R}^d, \end{aligned}$$
(11)

where \(\mathbf {H}\) is a \((d\times d)\) nonsingular matrix generalizing the bandwidth and \(K_d\) is a multivariate kernel function.

A popular representation for multivariate kernels consists in using product kernels, i.e., the product of the same univariate kernel function along each dimension:

$$\begin{aligned} K_d(\mathbf {y})=\prod _{j=1}^d\;K_1(y_j),\quad \forall \mathbf {y}= [y_1,\ldots ,y_d], \end{aligned}$$
(12)

defined on a polyhedral domain.
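A minimal sketch of the product kernel of Eq. 12 plugged into the multivariate estimator of Eq. 11; the Epanechnikov kernel and the test density are illustrative assumptions:

```python
import numpy as np

def epanechnikov(u):
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u * u), 0.0)

def product_kernel(y):
    """K_d of Eq. 12: product of the same 1D kernel over each coordinate."""
    return np.prod(epanechnikov(y), axis=-1)

def kde_nd(x, samples, H):
    """Multivariate estimator of Eq. 11 with a nonsingular bandwidth matrix H."""
    u = (np.asarray(x) - samples) @ np.linalg.inv(H).T   # H^{-1}(x - x_i)
    return product_kernel(u).sum() / (len(samples) * abs(np.linalg.det(H)))

# Uniform density on the unit square: the estimate at the center should be ~1.
rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 1.0, (20_000, 2))
print(kde_nd([0.5, 0.5], pts, H=0.2 * np.eye(2)))
```

With a diagonal H, the support of this product kernel is an axis-aligned box of half-extent h per axis, the simplest instance of a polyhedral domain.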

Kernels are generally chosen with the following properties:

$$\begin{aligned}&\forall \mathbf {y}\in \mathbb {R}^d,\;K_d(\mathbf {y}) \ge 0\quad \text {(positive)}, \end{aligned}$$
(13)
$$\begin{aligned}&\int _{\mathbb {R}^d}K_d(\mathbf {y})\;\mathrm {d}\mathbf {y} = 1\quad \text {(normalized)}, \end{aligned}$$
(14)
$$\begin{aligned}&\int _{\mathbb {R}^d}\mathbf {y} K_d(\mathbf {y})\;\mathrm {d}\mathbf {y} = 0\quad \text {(symmetric)}, \end{aligned}$$
(15)
$$\begin{aligned}&\mu _2(K_d)\mathbf {I}_d =\int _{\mathbb {R}^d}\mathbf {y}\mathbf {y}^T K_d(\mathbf {y})\;\mathrm {d}\mathbf {y} \quad \text {(bounded)}, \end{aligned}$$
(16)

where \(\mu _2(K_d)=\int y_i^2K_d(\mathbf {y})\mathrm {d}\mathbf {y}\) is independent of \(i\in [1\ldots d]\), \(\mu _2(K_d)<\infty \), and \(\mathbf {I}_d\) is the \(d\times d\) identity matrix.

Equations 13 and 14 ensure that the kernel can also be seen as a normalized probability density function; Eq. 15 states that \(K_d\) is a symmetric function, which is generally a desirable property for a convolution, except near support boundaries; Eq. 16 is only used to express the bias in terms of the finite value \(\mu _2(K_d)\).
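These properties are easy to check numerically. For the Epanechnikov kernel (an illustrative choice), the normalization, symmetry, and finite second moment (\(\mu_2 = 1/5\)) of Eqs. 14–16 follow by simple quadrature:

```python
import numpy as np

# Epanechnikov kernel sampled on its compact support [-1, +1].
u = np.linspace(-1.0, 1.0, 200_001)
K = 0.75 * (1.0 - u * u)
du = u[1] - u[0]

norm = (K * du).sum()          # Eq. 14: integrates to 1
mean = (u * K * du).sum()      # Eq. 15: first moment is 0 (symmetry)
mu2  = (u * u * K * du).sum()  # Eq. 16: finite second moment (1/5 here)
print(norm, mean, mu2)
```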

Density estimation is intrinsically biased. The bias is defined as:

$$\begin{aligned} b\left( \langle f_{\mathbf {H}}(\mathbf {x}) \rangle \right) = E\left\{ \langle f_{\mathbf {H}}(\mathbf {x}) \rangle \right\} - f(\mathbf {x}), \end{aligned}$$

where \(E\lbrace \langle f_{\mathbf {H}}(\mathbf {x}) \rangle \rbrace \) is the expected value of the estimator \(\langle f_{\mathbf {H}}(\mathbf {x}) \rangle \). When the estimator is biased, its expected value differs from the actual function:

$$\begin{aligned} b\left( \langle f_{\mathbf {H}}(\mathbf {x}) \rangle \right) \ne 0 \quad \Rightarrow \quad E\left\{ \langle f_{\mathbf {H}}(\mathbf {x}) \rangle \right\} \ne f(\mathbf {x}). \end{aligned}$$

In practice, users are mostly interested in density estimators that are consistent: \(\forall \mathbf {x}\in \mathbb {R}^d\), \(b\left( \langle f_{\mathbf {H}}(\mathbf {x}) \rangle \right) \rightarrow 0\) when all entries of \(\mathbf {H}\) tend to 0.

A good trade-off between bias and variance is difficult to find: increasing the bandwidth h reduces variance but increases bias; conversely, decreasing h reduces bias but increases variance.
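The trade-off can be observed empirically. The sketch below (an illustration, not the paper's experiment) repeats a 1D Epanechnikov estimate of a standard normal density at x = 0: a small h gives nearly zero bias but high variance across trials, while a large h does the opposite:

```python
import numpy as np

def epanechnikov(u):
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u * u), 0.0)

def kde_1d(x, samples, h):
    return epanechnikov((x - samples) / h).sum() / (len(samples) * h)

def bias_variance(h, trials=2_000, n=200):
    """Empirical bias and variance of the estimator at x = 0 over many trials."""
    rng = np.random.default_rng(2)
    true_f0 = 1.0 / np.sqrt(2.0 * np.pi)   # N(0,1) density at x = 0
    est = [kde_1d(0.0, rng.standard_normal(n), h) for _ in range(trials)]
    return np.mean(est) - true_f0, np.var(est)

for h in (0.05, 1.5):
    b, v = bias_variance(h)
    print(f"h={h}: bias={b:+.4f}, variance={v:.5f}")
```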

B Boundary bias in statistics

Boundary bias is well known in statistics. When the estimation domain is bounded, bias increases at the boundary, leading to inconsistent density estimation. This can be explained using a Taylor expansion (see [30] for the multivariate case, which exhibits the same behavior). In the univariate case, a kernel defined on the compact domain \([-1,+1]\) leads to the following bias expression:

$$\begin{aligned} b(\langle f_h(x) \rangle )&= \frac{f(x)}{h}\int _{-h}^{+h} K_1\left( \frac{z}{h}\right) \,\mathrm {d}z-f(x) \end{aligned}$$
(17)
$$\begin{aligned}&\quad -\frac{f'(x)}{h}\int _{-h}^{+h} zK_1\left( \frac{z}{h}\right) \,\mathrm {d}z \end{aligned}$$
(18)
$$\begin{aligned}&\quad +\frac{f''(x)}{2h}\int _{-h}^{+h}z^2K_1\left( \frac{z}{h}\right) \,\mathrm {d}z+o(h^2). \end{aligned}$$
(19)

Using Eq. 14 cancels the first two terms, Eq. 15 removes the third one, and the condition corresponding to Eq. 16 bounds the last term, showing that the bias is proportional to \(h^2\). Finally, when h decreases to 0, the bias also tends to 0: the estimator is consistent.

Let us now consider a function f defined over \([0,+\infty )\). Density estimation is performed using Eq. 10 for \(0<x<h\), and the previous bias approximation leads to:

$$\begin{aligned} b(\langle f_h(x) \rangle )&= \frac{f(x)}{h}\int _0^{+h} K_1\left( \frac{z}{h}\right) \,\mathrm {d}z-f(x)\\&\quad -\frac{f'(x)}{h}\int _0^{+h} zK_1\left( \frac{z}{h}\right) \,\mathrm {d}z \\&\quad +\frac{f''(x)}{2h}\int _0^{+h}z^2K_1\left( \frac{z}{h}\right) \,\mathrm {d}z+o(h^2). \end{aligned}$$

In this case, Eqs. 13, 14, 15, 16 no longer ensure a consistent estimator: the bias contains a term proportional to f(x) that cannot be reduced by increasing the number of observations.
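The boundary inconsistency is easy to reproduce numerically. In this sketch (an illustration under assumed parameters), f is the uniform density on [0, 1]; the plain estimator of Eq. 10 recovers f in the interior but loses roughly half of the kernel mass at x = 0:

```python
import numpy as np

def epanechnikov(u):
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u * u), 0.0)

def kde_1d(x, samples, h):
    return epanechnikov((x - samples) / h).sum() / (len(samples) * h)

rng = np.random.default_rng(3)
samples = rng.uniform(0.0, 1.0, 100_000)   # true density: f = 1 on [0, 1]

print(kde_1d(0.5, samples, h=0.05))  # interior: ≈ 1.0
print(kde_1d(0.0, samples, h=0.05))  # boundary: ≈ 0.5, whatever the sample count
```

Adding more samples sharpens the estimate around 0.5 at the boundary instead of correcting it, which is exactly the inconsistency described above.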

In order to ensure a consistent estimation, Jones proposes to normalize the kernel at each estimation point [14]; in other words, a normalization factor is calculated for each estimated point x:

$$\begin{aligned} \mathbb {K}_1(x) = \frac{1}{h}\int _{-h}^{+x} K_1\left( \frac{z}{h}\right) \,\mathrm {d}z, \end{aligned}$$
(20)

and density estimation becomes:

$$\begin{aligned} \langle f_h(x) \rangle =\frac{1}{nh\mathbb {K}_1(x)}\sum _{i=1}^n\;K_1\left( \frac{x-x_i}{h}\right) . \end{aligned}$$
(21)

Consequently, the density bias is in \(\mathcal O(h)\), and the estimator is thus consistent.
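A sketch of this normalization (Eqs. 20–21) for a domain clipped to [0, +inf), using the uniform density on [0, 1] as a test case; the closed-form kernel mass is tied to the Epanechnikov kernel, which is an illustrative assumption:

```python
import numpy as np

def epanechnikov(u):
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u * u), 0.0)

def kde_1d(x, samples, h):
    return epanechnikov((x - samples) / h).sum() / (len(samples) * h)

def epan_mass(t):
    """Integral of the Epanechnikov kernel over [-1, min(t, 1)] (closed form)."""
    t = min(max(t, -1.0), 1.0)
    return 0.5 + 0.75 * t - 0.25 * t ** 3

def corrected_kde(x, samples, h):
    """Boundary-corrected estimator of Eq. 21 for a domain [0, +inf):
    divide by the kernel mass K_1(x) of Eq. 20 that lies inside the domain."""
    return kde_1d(x, samples, h) / epan_mass(x / h)

rng = np.random.default_rng(3)
samples = rng.uniform(0.0, 1.0, 100_000)   # true density: f = 1 on [0, 1]
print(corrected_kde(0.0, samples, h=0.05))  # ≈ 1.0 even at the boundary
```

For x ≥ h the normalization factor is 1 and the estimator falls back to Eq. 10.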

The generalization to any dimension d of this normalization process requires determining the normalization factor \(\mathbb {K}_d(\mathbf {x})\) resulting from integrating the kernel over a domain \(\mathcal {D}=\mathcal {D}_1 \times \mathcal {D}_2 \times \cdots \times \mathcal {D}_d\):

$$\begin{aligned} \mathbb {K}_d(\mathbf {x})&= \int _{\mathcal {D}} K_d( u_1,\ldots ,u_d ) \;\mathrm {d}u_1\ldots \mathrm {d}u_d \nonumber \\&= \prod _{j=1}^d \int _{\mathcal {D}_j} K_1(u_j)\;\mathrm {d}u_j, \end{aligned}$$
(22)

where each sub-domain \(\mathcal {D}_j\) is orthogonal to the others, which makes the integral separable and thus easier to evaluate.
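Because the product kernel separates, the d-dimensional factor of Eq. 22 reduces to a product of 1D integrals. A sketch, assuming each axis is clipped to [0, +inf) (a hypothetical domain choice) and the Epanechnikov closed-form mass:

```python
import numpy as np

def epan_mass(t):
    """Integral of the Epanechnikov kernel over [-1, min(t, 1)] (closed form)."""
    t = np.clip(t, -1.0, 1.0)
    return 0.5 + 0.75 * t - 0.25 * t ** 3

def normalization_nd(x, h):
    """K_d(x) of Eq. 22 as a product of independent per-axis integrals,
    with each sub-domain D_j assumed to be [0, +inf)."""
    return float(np.prod(epan_mass(np.asarray(x, dtype=float) / h)))

print(normalization_nd([0.0, 0.0, 0.0], h=0.1))  # 0.5**3 = 0.125 at the corner
print(normalization_nd([1.0, 1.0, 1.0], h=0.1))  # 1.0: kernel fully inside
```

Each axis contributes its own 1D mass, so the cost of the correction grows only linearly with the dimension.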

Cite this article

Perrot, R., Aveneau, L., Mora, F. et al. Photon mapping with visible kernel domains. Vis Comput 35, 707–720 (2019). https://doi.org/10.1007/s00371-018-1505-y
