
Computational Visual Media, Volume 5, Issue 1, pp. 73–89

Image-based appearance acquisition of effect coatings

  • Jiří Filip
  • Radomír Vávra

Open Access Research Article

Abstract

Paint manufacturers strive to introduce unique visual effects to coatings in order to visually communicate functional properties of products using value-added, customized design. However, these effects often feature complex, angularly dependent, spatially-varying behavior, and thus represent a challenge for digital reproduction. In this paper we analyze several approaches to capturing the spatially-varying appearance of effect coatings. We compare a baseline approach based on a bidirectional texture function (BTF) with four variants of half-difference parameterization. Through a psychophysical study, we determine minimal sampling along individual dimensions of this parameterization. We conclude that, compared to BTF, bivariate representations better preserve the visual fidelity of effect coatings, better characterizing near-specular behavior while significantly restricting the number of images that must be captured.
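As an illustration of the half-difference (Rusinkiewicz) parameterization referred to in the abstract, the following Python sketch converts a light/view direction pair into half-angle and difference-angle coordinates; the bivariate representations discussed in the paper retain only the pair (theta_h, theta_d). This is not code from the article, and the function and variable names are our own; it assumes unit direction vectors expressed in a local frame with the surface normal along +z.

    # Minimal sketch (not the authors' code): Rusinkiewicz half/difference
    # parameterization of a direction pair, assuming unit vectors in a local
    # frame with the surface normal along +z.
    import numpy as np

    def rotate_z(v, a):
        """Rotate vector v by angle a (radians) about the z axis."""
        c, s = np.cos(a), np.sin(a)
        return np.array([c * v[0] - s * v[1], s * v[0] + c * v[1], v[2]])

    def rotate_y(v, a):
        """Rotate vector v by angle a (radians) about the y axis."""
        c, s = np.cos(a), np.sin(a)
        return np.array([c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2]])

    def half_diff_angles(wi, wo):
        """Return (theta_h, phi_h, theta_d, phi_d) for unit directions wi, wo."""
        h = wi + wo
        h = h / np.linalg.norm(h)
        theta_h = np.arccos(np.clip(h[2], -1.0, 1.0))
        phi_h = np.arctan2(h[1], h[0])
        # Express wi in the frame of the half vector to obtain the difference angles.
        d = rotate_y(rotate_z(wi, -phi_h), -theta_h)
        theta_d = np.arccos(np.clip(d[2], -1.0, 1.0))
        phi_d = np.arctan2(d[1], d[0])
        return theta_h, phi_h, theta_d, phi_d

    # Example: light and view both 45 degrees off the normal, at opposite azimuths
    # (mirror configuration) -> theta_h = 0, theta_d = 45 degrees.
    wi = np.array([np.sin(np.pi / 4), 0.0, np.cos(np.pi / 4)])
    wo = np.array([-np.sin(np.pi / 4), 0.0, np.cos(np.pi / 4)])
    print(half_diff_angles(wi, wo))

Because the half angle theta_h measures the deviation from the mirror configuration, dense sampling near theta_h = 0 is what captures the near-specular behavior of effect coatings.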

Keywords

effect coatings; measurement; bidirectional texture function (BTF); appearance; psychophysical experiment

Notes

Acknowledgements

The authors would like to thank Frank J. Maile from Schlenk Metallic Pigments GmbH for sample preparation and inspiring discussions, our colleague Martina Kolafová for organizing and running the psychophysical experiments, and all anonymous subjects for the time they devoted to participating in the visual experiments. This research was supported by Czech Science Foundation grant 17-18407S.

Supplementary material

41095_2019_134_MOESM1_ESM.avi (67.7 MB)
Supplementary material, approximately 67.7 MB.
41095_2019_134_MOESM2_ESM.pdf (11.6 MB)
Image-based Appearance Acquisition of Effect Coatings


Copyright information

© The Author(s) 2019

This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.

The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.


Authors and Affiliations

  1. The Czech Academy of Sciences, Institute of Information Theory and Automation, Praha 8, Czech Republic
